
Zero-Resource Cross-Lingual Named Entity Recognition

About

Recently, neural methods have achieved state-of-the-art (SOTA) results in Named Entity Recognition (NER) tasks for many languages without the need for manually crafted features. However, these models still require manually annotated training data, which is not available for many languages. In this paper, we propose an unsupervised cross-lingual NER model that can transfer NER knowledge from one language to another without relying on any bilingual dictionary or parallel data. Our model achieves this through word-level adversarial learning and augmented fine-tuning with parameter sharing and feature augmentation. Experiments on five different languages demonstrate the effectiveness of our approach, outperforming existing models by a significant margin and setting a new SOTA for each language pair.
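The word-level adversarial learning mentioned above aligns monolingual word-embedding spaces without any bilingual supervision: a linear map projects source-language vectors into the target space, while a discriminator tries to tell mapped vectors from real target vectors. A minimal sketch of that idea, assuming toy random "embeddings" and a logistic-regression discriminator (the paper's actual architecture and training details differ):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy monolingual "embeddings" (hypothetical; the paper uses real
# pretrained word vectors). Target space has per-dimension scales the
# linear map must learn to imitate.
d, n = 8, 500
src = rng.normal(size=(n, d))
scales = np.linspace(0.5, 2.0, d)
tgt = rng.normal(size=(n, d)) * scales

W = np.eye(d)                        # linear mapping: source -> target space
w_disc = np.zeros(d)                 # logistic-regression discriminator
b_disc = 0.0

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

lr = 0.05
for step in range(2000):
    # Discriminator step: label mapped-source 0, real-target 1.
    xs = src[rng.integers(n, size=32)]
    xt = tgt[rng.integers(n, size=32)]
    mapped = xs @ W
    x = np.vstack([mapped, xt])
    y = np.concatenate([np.zeros(32), np.ones(32)])
    p = sigmoid(x @ w_disc + b_disc)
    grad = p - y                      # gradient of logistic loss w.r.t. logits
    w_disc -= lr * x.T @ grad / len(y)
    b_disc -= lr * grad.mean()

    # Mapping step: update W so mapped vectors look "real" (label 1),
    # i.e. fool the discriminator.
    p_m = sigmoid(mapped @ w_disc + b_disc)
    g_mapped = (p_m - 1.0)[:, None] * w_disc[None, :]
    W -= lr * xs.T @ g_mapped / len(xs)
```

After training, source vectors mapped through `W` are distributed more like the target vectors, which is the zero-resource bridge that lets an NER model trained on the source language be fine-tuned for the target language.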

M Saiful Bari, Shafiq Joty, Prathyusha Jwalapuram • 2019

Related benchmarks

| Task                     | Dataset                        | Metric          | Result | Rank |
|--------------------------|--------------------------------|-----------------|--------|------|
| Named Entity Recognition | CoNLL Spanish NER 2002 (test)  | F1 Score        | 75.93  | 98   |
| Named Entity Recognition | CoNLL Dutch 2002 (test)        | F1 Score        | 74.61  | 87   |
| Named Entity Recognition | CoNLL German 2003 (test)       | F1 Score        | 65.24  | 78   |
| Named Entity Recognition | CoNLL NER 2002/2003 (test)     | German F1 Score | 65.24  | 59   |
| Named Entity Recognition | Spanish (test)                 | --              | --     | 15   |
| Named Entity Recognition | Dutch (test)                   | --              | --     | 15   |
| Named Entity Recognition | English-to-Spanish en-es       | F1 Score        | 75.93  | 12   |
| Named Entity Recognition | English-to-Dutch en-nl         | F1 Score        | 74.61  | 12   |
| Named Entity Recognition | English-to-German en-de        | F1 Score        | 65.24  | 12   |
| Named Entity Recognition | CoNLL de 2003 (test)           | F1 Score        | 65.24  | 12   |

Showing 10 of 16 rows

Other info

Code
