
ConNER: Consistency Training for Cross-lingual Named Entity Recognition

About

Cross-lingual named entity recognition (NER) suffers from data scarcity in the target languages, especially under zero-shot settings. Existing translate-train or knowledge distillation methods attempt to bridge the language gap, but often introduce a high level of noise. To solve this problem, consistency training methods regularize the model to be robust to perturbations on data or hidden states. However, such methods are likely to violate the consistency hypothesis, or focus mainly on coarse-grained consistency. We propose ConNER, a novel consistency training framework for cross-lingual NER, which comprises: (1) translation-based consistency training on unlabeled target-language data, and (2) dropout-based consistency training on labeled source-language data. ConNER effectively leverages unlabeled target-language data and alleviates overfitting on the source language to enhance cross-lingual adaptability. Experimental results show that ConNER achieves consistent improvements over various baseline methods.
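The dropout-based consistency training in (2) can be sketched as two stochastic forward passes over the same labeled source-language input, with a symmetric KL penalty tying the two predicted label distributions together. Below is a minimal pure-Python sketch under stated assumptions: the tiny linear "tagger", its weights, and the dropout placement are hypothetical stand-ins for illustration, not the paper's actual architecture.

```python
import math
import random

def softmax(logits):
    # numerically stable softmax over a list of logits
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    s = sum(exps)
    return [e / s for e in exps]

def kl(p, q):
    # KL divergence KL(p || q) between two discrete distributions
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def dropout(vec, rate, rng):
    # inverted dropout: zero each unit with probability `rate`,
    # scale survivors by 1 / keep so the expectation is unchanged
    keep = 1.0 - rate
    return [x / keep if rng.random() < keep else 0.0 for x in vec]

def forward(x, weights, rate, rng):
    # hypothetical tiny linear "tagger": dropout on the input
    # features, then a linear layer and softmax over labels
    h = dropout(x, rate, rng)
    logits = [sum(wi * hi for wi, hi in zip(row, h)) for row in weights]
    return softmax(logits)

def dropout_consistency_loss(x, weights, rate, rng):
    # two stochastic passes over the same input draw different
    # dropout masks; the symmetric KL penalizes their disagreement
    p = forward(x, weights, rate, rng)
    q = forward(x, weights, rate, rng)
    return 0.5 * (kl(p, q) + kl(q, p))

rng = random.Random(0)
x = [1.0, 2.0, 0.5]                       # toy input features
w = [[0.3, 0.1, -0.2], [0.0, 0.4, 0.2]]   # toy weights, 2 labels
loss = dropout_consistency_loss(x, w, 0.1, rng)
```

In training, this penalty would be added to the usual supervised loss on the source-language labels; with the dropout rate set to zero the two passes coincide and the penalty vanishes.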

Ran Zhou, Xin Li, Lidong Bing, Erik Cambria, Luo Si, Chunyan Miao • 2022

Related benchmarks

Task | Dataset | Result | Rank
Named Entity Recognition | WikiAnn (test) | Average Accuracy: 57.76 | 58
Named Entity Recognition | CoNLL (test) | F1 Score (de): 77.14 | 28
Annotation Projection | Multi-task sequence labeling dataset (combined) | Average Score: 48.7 | 7
Argument Mining | Argument Mining (AM) annotation projection dataset (test) | ES: 0.216 | 7
Named Entity Recognition | NER annotation projection dataset (test) | ES Score: 68.8 | 7
Opinion Target Extraction | OTE (Opinion Target Extraction) annotation projection (test) | OTE ES Score: 66.5 | 7
