
TENER: Adapting Transformer Encoder for Named Entity Recognition

About

Bidirectional long short-term memory networks (BiLSTM) have been widely used as encoders in models for the named entity recognition (NER) task. Recently, the Transformer has been broadly adopted in various Natural Language Processing (NLP) tasks owing to its parallelism and strong performance. Nevertheless, the Transformer's performance in NER is not as good as in other NLP tasks. In this paper, we propose TENER, a NER architecture that adopts an adapted Transformer encoder to model character-level and word-level features. By incorporating direction- and relative-distance-aware attention and un-scaled attention, we show that a Transformer-like encoder is just as effective for NER as for other NLP tasks.

Hang Yan, Bocao Deng, Xiaonan Li, Xipeng Qiu• 2019
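The two adaptations named in the abstract can be sketched in a few lines. The following is a minimal, hypothetical illustration (not the authors' implementation): relative distances `i - j` keep their sign so attention is direction-aware via sinusoidal embeddings (sine is an odd function of distance), and the usual division by sqrt(d) is dropped, giving un-scaled attention. All names (`tener_attention`, `rel_pos_embedding`) and the bias vectors `u`, `v` are assumptions for illustration.

```python
import math

def rel_pos_embedding(t, d):
    # Sinusoidal embedding of a signed relative distance t.
    # sin is odd in t, so r(t) != r(-t): direction is preserved.
    return [math.sin(t / 10000 ** (i / d)) if i % 2 == 0
            else math.cos(t / 10000 ** ((i - 1) / d))
            for i in range(d)]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def softmax(xs):
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def tener_attention(Q, K, V, u, v):
    """Sketch of direction-aware, un-scaled self-attention.
    Q, K, V: lists of d-dim vectors (one per token);
    u, v: bias vectors (learned parameters in a real model)."""
    d = len(Q[0])
    out = []
    for i, q in enumerate(Q):
        scores = []
        for j, k in enumerate(K):
            r = rel_pos_embedding(i - j, d)  # signed distance i - j
            # content-content + content-position + two bias terms;
            # note there is NO division by sqrt(d) (un-scaled attention).
            scores.append(dot(q, k) + dot(q, r) + dot(u, k) + dot(v, r))
        w = softmax(scores)
        out.append([sum(w[j] * V[j][m] for j in range(len(V)))
                    for m in range(d)])
    return out
```

Each output row is a convex combination of the value vectors; a real model would add multi-head projections and stack these layers over character- and word-level features.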

Related benchmarks

| Task | Dataset | Result | Rank |
| --- | --- | --- | --- |
| Named Entity Recognition | CoNLL English 2003 (test) | -- | 135 |
| Named Entity Recognition | OntoNotes | F1-score: 89.78 | 91 |
| Named Entity Recognition | WNUT 2017 (test) | F1-score: 48.98 | 63 |
| Named Entity Recognition | MSRA (test) | F1-score: 92.74 | 63 |
| Named Entity Recognition | OntoNotes 4.0 (test) | F1-score: 72.43 | 55 |
| Named Entity Recognition | RESUME | F1-score: 95 | 52 |
| Named Entity Recognition | Weibo (test) | -- | 50 |
| Named Entity Recognition | MSRA | F1-score: 92.74 | 29 |
| Named Entity Recognition | Resume (test) | F1-score: 95 | 28 |
| Named Entity Recognition | Weibo | F1-score: 58.17 | 27 |
Showing 10 of 14 rows
