
Enhancing Cross-lingual Transfer by Manifold Mixup

About

Based on large-scale pre-trained multilingual representations, recent cross-lingual transfer methods have achieved impressive transfer performance. However, the performance on target languages still lags far behind that of the source language. In this paper, our analysis indicates that this performance gap is strongly associated with the cross-lingual representation discrepancy. To achieve better cross-lingual transfer performance, we propose the cross-lingual manifold mixup (X-Mixup) method, which adaptively calibrates the representation discrepancy and gives a compromised representation for target languages. Experiments on the XTREME benchmark show that X-Mixup achieves 1.8% performance gains on multiple text understanding tasks compared with strong baselines, and significantly reduces the cross-lingual representation discrepancy.
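The core idea of manifold mixup is to interpolate hidden representations rather than raw inputs. The sketch below illustrates that interpolation for source- and target-language hidden states; the abstract does not specify X-Mixup's adaptive mixing schedule, so the fixed ratio `lam` and the function name are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def manifold_mixup(h_src, h_tgt, lam):
    """Linearly interpolate source- and target-language hidden states.

    h_src, h_tgt: (seq_len, hidden_dim) arrays of hidden representations
                  for (assumed aligned) parallel text.
    lam: mixing ratio in [0, 1]. X-Mixup adapts this per example; a
         fixed value is used here purely for illustration.
    """
    return lam * h_src + (1.0 - lam) * h_tgt

rng = np.random.default_rng(0)
h_src = rng.normal(size=(8, 16))  # toy source-language hidden states
h_tgt = rng.normal(size=(8, 16))  # toy target-language hidden states

mixed = manifold_mixup(h_src, h_tgt, lam=0.3)
print(mixed.shape)  # (8, 16)
```

The mixed representation sits between the two language manifolds, which is the "compromised representation" the abstract describes.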

Huiyun Yang, Huadong Chen, Hao Zhou, Lei Li • 2022

Related benchmarks

Task | Dataset | Metric | Result | Rank
Natural Language Inference | XNLI | Accuracy | 84.62 | 111
Natural Language Understanding | NusaX | Macro F1 | 79.94 | 28
Question Answering | TyDiQA | Exact Match | 48.23 | 28
