
Better Feature Integration for Named Entity Recognition

About

It has been shown that named entity recognition (NER) can benefit from incorporating the long-distance structured information captured by dependency trees. We believe this is because the two types of features, the contextual information captured by linear sequences and the structured information captured by dependency trees, complement each other. However, existing approaches have largely focused on stacking LSTMs and graph neural networks such as graph convolutional networks (GCNs) to build improved NER models, where the exact interaction mechanism between the two types of features is not very clear, and the performance gain does not appear to be significant. In this work, we propose a simple and robust solution that incorporates both types of features with our Synergized-LSTM (Syn-LSTM), which clearly captures how the two types of features interact. We conduct extensive experiments on several standard datasets across four languages. The results demonstrate that the proposed model achieves better performance than previous approaches while requiring fewer parameters. Our further analysis shows that our model captures longer dependencies than strong baselines.
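The core idea above, feeding a recurrent cell two input streams, the token embedding and a graph-encoded representation of the same token, each controlled by its own gate, can be illustrated with a minimal numpy sketch. This is not the paper's implementation: the class name `SynLSTMCell`, the specific gate equations, and the single mean-normalized GCN layer are illustrative assumptions; the exact formulation is given in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gcn_layer(H, A, W):
    """One graph-convolution layer over a dependency graph (illustrative).

    H: (n, d) token features; A: (n, n) adjacency with self-loops;
    W: (d, d_out) weights. Row-normalizes A, then applies ReLU(A_hat H W).
    """
    deg = A.sum(axis=1, keepdims=True)
    A_hat = A / np.maximum(deg, 1.0)
    return np.maximum(A_hat @ H @ W, 0.0)

class SynLSTMCell:
    """Sketch of an LSTM cell with a second, gated input stream g_t:
    the graph-encoded (dependency-tree) representation of the token.
    Gate equations here are an assumption for illustration only."""

    def __init__(self, d_x, d_g, d_h, rng):
        s = 0.1
        # standard LSTM part: gates driven by x_t and h_{t-1}
        self.W = s * rng.standard_normal((4, d_x, d_h))
        self.U = s * rng.standard_normal((4, d_h, d_h))
        # extra gate and candidate driven by the graph input g_t
        self.Wg = s * rng.standard_normal((2, d_g, d_h))
        self.Ug = s * rng.standard_normal((2, d_h, d_h))

    def step(self, x, g, h, c):
        i = sigmoid(x @ self.W[0] + h @ self.U[0])       # input gate
        f = sigmoid(x @ self.W[1] + h @ self.U[1])       # forget gate
        o = sigmoid(x @ self.W[2] + h @ self.U[2])       # output gate
        c_tilde = np.tanh(x @ self.W[3] + h @ self.U[3]) # sequential candidate
        m = sigmoid(g @ self.Wg[0] + h @ self.Ug[0])     # extra gate for g_t
        g_tilde = np.tanh(g @ self.Wg[1] + h @ self.Ug[1])  # graph candidate
        c = f * c + i * c_tilde + m * g_tilde            # two gated inputs
        h = o * np.tanh(c)
        return h, c

# Toy run: 4 tokens, a chain-shaped dependency graph.
d_x, d_g, d_h, n = 5, 4, 6, 4
H = rng.standard_normal((n, d_x))                 # token embeddings
A = np.eye(n)                                     # self-loops
for u, v in [(0, 1), (1, 2), (2, 3)]:             # dependency edges
    A[u, v] = A[v, u] = 1.0
G = gcn_layer(H, A, 0.1 * rng.standard_normal((d_x, d_g)))

cell = SynLSTMCell(d_x, d_g, d_h, rng)
h, c = np.zeros(d_h), np.zeros(d_h)
for t in range(n):                                # run left to right
    h, c = cell.step(H[t], G[t], h, c)
```

The point of the sketch is that the graph features enter the cell through their own gate `m` rather than by naively stacking a GCN on top of the LSTM outputs, which is the interaction the abstract says prior stacking approaches leave unclear.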

Lu Xu, Zhanming Jie, Wei Lu, Lidong Bing • 2021

Related benchmarks

Task                      Dataset            Metric     Result   Rank
Named Entity Recognition  CoNLL 2003         F1 score   92.52    86
Named Entity Recognition  WNUT 2017          F1 score   59.58    79
Named Entity Recognition  WeiboNER           F1 score   70.28    27
Sequence Labeling         Restaurant 16      F1 score   76.9     20
Sequence Labeling         Restaurant 15      F1 score   68.54    20
Sequence Labeling         Laptop 14          F1 score   69.52    20
Sequence Labeling         Restaurant 14      F1 score   77.52    20
Named Entity Recognition  OntoNotes English  Precision  90.14    5
