
Syntax-augmented Multilingual BERT for Cross-lingual Transfer

About

In recent years, we have seen a colossal effort in pre-training multilingual text encoders using large-scale corpora in many languages to facilitate cross-lingual transfer learning. However, due to typological differences across languages, cross-lingual transfer is challenging. Nevertheless, language syntax, e.g., syntactic dependencies, can bridge the typological gap. Previous works have shown that pre-trained multilingual encoders, such as mBERT (Devlin et al., 2019), capture language syntax, helping cross-lingual transfer. This work shows that explicitly providing language syntax and training mBERT with an auxiliary objective to encode the universal dependency tree structure helps cross-lingual transfer. We perform rigorous experiments on four NLP tasks: text classification, question answering, named entity recognition, and task-oriented semantic parsing. The results show that syntax-augmented mBERT improves cross-lingual transfer on popular benchmarks, such as PAWS-X and MLQA, by 1.4 and 1.6 points on average across all languages. In the generalized transfer setting, the gains are larger: 3.9 and 3.1 points on average on PAWS-X and MLQA.
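One way to realize such an auxiliary syntax objective is sketched below: fine-tune mBERT on a downstream task while a second head predicts each token's dependency head from a Universal Dependencies parse, and add the two losses. This is a minimal illustrative sketch, not the paper's exact method; the class name `SyntaxAugmentedMBert`, the bilinear-style arc scorer, the `head_ids` encoding, and the `aux_weight` value are all assumptions made for the example.

```python
# Hypothetical sketch: joint task loss + auxiliary dependency-head prediction.
# The arc scorer and loss weighting are illustrative, not the paper's design.
import torch
import torch.nn as nn
import torch.nn.functional as F
from transformers import BertModel


class SyntaxAugmentedMBert(nn.Module):
    def __init__(self, model_name="bert-base-multilingual-cased",
                 num_labels=2, aux_weight=0.1):
        super().__init__()
        self.encoder = BertModel.from_pretrained(model_name)
        hidden = self.encoder.config.hidden_size
        self.classifier = nn.Linear(hidden, num_labels)  # main task head
        self.dep_proj = nn.Linear(hidden, hidden)        # dependent projection
        self.head_proj = nn.Linear(hidden, hidden)       # head projection
        self.aux_weight = aux_weight                     # syntax loss weight

    def forward(self, input_ids, attention_mask, labels=None, head_ids=None):
        # head_ids[b, i]: index of token i's dependency head (-100 = ignore).
        states = self.encoder(
            input_ids=input_ids, attention_mask=attention_mask
        ).last_hidden_state                              # (batch, seq, hidden)
        logits = self.classifier(states[:, 0])           # [CLS] classification

        loss = None
        if labels is not None:
            loss = F.cross_entropy(logits, labels)       # main task loss
        if head_ids is not None:
            # arc_scores[b, i, j]: score that token j is the head of token i,
            # supervised with gold head indices from a UD parse.
            arc_scores = self.dep_proj(states) @ self.head_proj(states).transpose(1, 2)
            aux_loss = self.aux_weight * F.cross_entropy(
                arc_scores.reshape(-1, arc_scores.size(-1)),
                head_ids.reshape(-1),
                ignore_index=-100,                       # pad / special tokens
            )
            loss = aux_loss if loss is None else loss + aux_loss
        return loss, logits
```

The key point the sketch captures is that syntax supervision enters only as an extra loss term during fine-tuning, so the encoder, task head, and inference path are unchanged from a standard mBERT setup.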

Wasi Uddin Ahmad, Haoran Li, Kai-Wei Chang, Yashar Mehdad • 2021

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Natural Language Inference | XNLI (test) | Average Accuracy | 68.5 | 167 |
| Named Entity Recognition | WikiAnn (test) | Average Accuracy | 69 | 58 |
| Question Answering | MLQA (test) | -- | -- | 35 |
| Named Entity Recognition | CoNLL (test) | F1 Score (de) | 69.1 | 28 |
| Semantic Parsing | mTOP (test) | Average Score | 41.4 | 17 |
| Paraphrase Identification | PAWS-X (test) | Accuracy (en) | 94 | 13 |
| Question Answering | XQuAD (test) | -- | -- | 9 |
| Semantic Parsing | mATIS++ (test) | Score (en) | 86.2 | 2 |

Other info

Code
