
Transfer Learning for Sequence Tagging with Hierarchical Recurrent Networks

About

Recent papers have shown that neural networks obtain state-of-the-art performance on several different sequence tagging tasks. One appealing property of such systems is their generality, as excellent performance can be achieved with a unified architecture and without task-specific feature engineering. However, it is unclear if such systems can be used for tasks without large amounts of training data. In this paper we explore the problem of transfer learning for neural sequence taggers, where a source task with plentiful annotations (e.g., POS tagging on Penn Treebank) is used to improve performance on a target task with fewer available annotations (e.g., POS tagging for microblogs). We examine the effects of transfer learning for deep hierarchical recurrent networks across domains, applications, and languages, and show that significant improvement can often be obtained. These gains also yield improvements over the current state-of-the-art on several well-studied tasks.
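The abstract describes a transfer scheme in which the lower, representation-learning layers of a hierarchical recurrent tagger are shared between a high-resource source task and a low-resource target task, while each task keeps its own output layer (label sets differ across tasks). A minimal sketch of that parameter-sharing idea, with all class names, parameter names, and label counts being illustrative assumptions rather than the paper's actual code:

```python
# Sketch of cross-task parameter sharing for a hierarchical tagger.
# The "shared" dict stands in for the character- and word-level RNN
# weights that transfer between tasks; "task_heads" stands in for the
# per-task output layers, which are never transferred.

class HierarchicalTagger:
    def __init__(self):
        # Lower layers: shared across tasks (the part that transfers).
        self.shared = {"char_rnn": [0.0] * 4, "word_rnn": [0.0] * 8}
        # Upper layers: one task-specific classifier per task.
        self.task_heads = {}

    def add_task(self, name, n_labels):
        # Each task gets its own output layer sized to its label set.
        self.task_heads[name] = {"n_labels": n_labels}


def transfer(source, target):
    # Initialize the target model's shared layers with the source
    # model's trained weights; task heads are left untouched because
    # the source and target label sets generally differ.
    target.shared = {name: list(w) for name, w in source.shared.items()}


# Source task: plentiful annotations (e.g., Penn Treebank POS).
source = HierarchicalTagger()
source.add_task("ptb_pos", n_labels=45)
source.shared["word_rnn"] = [0.3] * 8  # pretend these were trained

# Target task: low-resource domain (e.g., microblog POS).
target = HierarchicalTagger()
target.add_task("microblog_pos", n_labels=25)

transfer(source, target)
```

After `transfer`, the target model starts from the source's learned lower-layer weights but trains its own output head, which is the mechanism the paper evaluates across domains, applications, and languages.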

Zhilin Yang, Ruslan Salakhutdinov, William W. Cohen • 2017

Related benchmarks

Task                      Dataset                        Metric    Result  Rank
Named Entity Recognition  CoNLL 2003 (test)              F1 Score  91.26   539
Named Entity Recognition  CoNLL English 2003 (test)      F1 Score  91.2    135
Named Entity Recognition  CoNLL Spanish NER 2002 (test)  F1 Score  85.77   98
Chunking                  CoNLL 2000 (test)              F1 Score  95.41   88
Named Entity Recognition  CoNLL Dutch 2002 (test)        F1 Score  85      87
Named Entity Recognition  CoNLL 2003                     F1 Score  91.26   86
Part-of-Speech Tagging    Penn Treebank (test)           Accuracy  97.55   64
Part-of-Speech Tagging    WSJ (test)                     Accuracy  97.55   51
POS Tagging               PTB (test)                     Accuracy  97.55   24
Part-of-Speech Tagging    Penn Treebank POS (test)       F1 Score  97.55   10

Showing 10 of 13 rows
