
Neural Architectures for Named Entity Recognition

About

State-of-the-art named entity recognition systems rely heavily on hand-crafted features and domain-specific knowledge in order to learn effectively from the small, supervised training corpora that are available. In this paper, we introduce two new neural architectures: one based on bidirectional LSTMs and conditional random fields, and the other constructing and labeling segments using a transition-based approach inspired by shift-reduce parsers. Our models rely on two sources of information about words: character-based word representations learned from the supervised corpus and unsupervised word representations learned from unannotated corpora. Our models obtain state-of-the-art performance in NER in four languages without resorting to any language-specific knowledge or resources such as gazetteers.
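In the BiLSTM-CRF architecture the abstract describes, the BiLSTM produces per-token tag scores and the CRF layer adds tag-transition scores, with the best tag sequence recovered by Viterbi decoding. Below is a minimal numpy sketch of that decoding step under assumed inputs (an `emissions` matrix of per-token scores and a `transitions` matrix of tag-to-tag scores); the names are illustrative and not taken from the paper's code.

```python
import numpy as np

def viterbi_decode(emissions, transitions):
    """Return the highest-scoring tag sequence as a list of tag indices.

    emissions:   (seq_len, n_tags) per-token tag scores (e.g. BiLSTM outputs)
    transitions: (n_tags, n_tags) score of moving from tag i to tag j
    """
    seq_len, n_tags = emissions.shape
    # score[t, j] = best score of any tag path ending in tag j at position t
    score = np.zeros((seq_len, n_tags))
    backptr = np.zeros((seq_len, n_tags), dtype=int)
    score[0] = emissions[0]
    for t in range(1, seq_len):
        # candidate[i, j]: path ends in tag i at t-1, then moves to tag j
        candidate = score[t - 1][:, None] + transitions + emissions[t][None, :]
        backptr[t] = candidate.argmax(axis=0)
        score[t] = candidate.max(axis=0)
    # Follow back-pointers from the best final tag to recover the path.
    best = [int(score[-1].argmax())]
    for t in range(seq_len - 1, 1 - 1, -1):
        best.append(int(backptr[t, best[-1]]))
    return best[::-1]

# Toy example: tags 0=O, 1=B-PER, 2=I-PER for a two-token sentence.
# A large negative transition score forbids the illegal O -> I-PER move.
emissions = np.array([[0.0, 2.0, 1.0],
                      [0.0, 0.0, 2.0]])
transitions = np.zeros((3, 3))
transitions[0, 2] = -10000.0
print(viterbi_decode(emissions, transitions))  # -> [1, 2], i.e. B-PER I-PER
```

At training time the CRF is fit by maximizing the log-likelihood of gold tag sequences (a forward-algorithm sum rather than a max); this sketch covers only inference.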

Guillaume Lample, Miguel Ballesteros, Sandeep Subramanian, Kazuya Kawakami, Chris Dyer • 2016

Related benchmarks

Task                               Dataset                          Metric        Result   Rank
Named Entity Recognition           CoNLL 2003 (test)                F1 Score      91.14    539
Nested Named Entity Recognition    ACE 2004 (test)                  F1 Score      58.3     166
Nested Named Entity Recognition    ACE 2005 (test)                  F1 Score      57.6     153
Named Entity Recognition           CoNLL English 2003 (test)        F1 Score      91.13    135
Named Entity Recognition           CoNLL 03                         F1 (Entity)   90.94    102
Named Entity Recognition           CoNLL Spanish NER 2002 (test)    F1 Score      86.12    98
Chunking                           CoNLL 2000 (test)                F1 Score      94.97    88
Named Entity Recognition           CoNLL Dutch 2002 (test)          F1 Score      87.13    87
Named Entity Recognition           CoNLL 2003                       F1 Score      90.94    86
Named Entity Recognition           CoNLL German 2003 (test)         F1 Score      78.8     78

Showing 10 of 53 rows
