
Counter-Interference Adapter for Multilingual Machine Translation

About

Developing a unified multilingual model has long been a pursuit for machine translation. However, existing approaches suffer from performance degradation: a single multilingual model is inferior to separately trained bilingual ones on rich-resource languages. We conjecture that this phenomenon is due to interference caused by joint training with multiple languages. To address the issue, we propose CIAT, an adapted Transformer model with a small parameter overhead for multilingual machine translation. We evaluate CIAT on multiple benchmark datasets, including IWSLT, OPUS-100, and WMT. Experiments show that CIAT consistently outperforms strong multilingual baselines on 64 of the 66 language directions, 42 of which see an improvement of more than 0.5 BLEU. Our code is available at https://github.com/Yaoming95/CIAT
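The abstract describes CIAT as an adapted Transformer with a small parameter overhead. A minimal sketch of the general bottleneck-adapter idea is below, using NumPy; the class name, initialization, and layer placement here are illustrative assumptions, not the paper's exact design (see the linked repository for the actual implementation):

```python
import numpy as np

class BottleneckAdapter:
    """Hypothetical adapter sketch: down-project to a small bottleneck,
    apply a nonlinearity, project back up, and add a residual connection.
    Per-language adapters like this can be inserted into a shared
    Transformer so languages interfere less with each other."""

    def __init__(self, d_model, d_bottleneck, seed=0):
        rng = np.random.default_rng(seed)
        self.w_down = rng.normal(0.0, 0.02, size=(d_model, d_bottleneck))
        # Zero-initialized up-projection: the adapter starts as an
        # identity map and only deviates as it is trained.
        self.w_up = np.zeros((d_bottleneck, d_model))

    def __call__(self, x):
        hidden = np.maximum(x @ self.w_down, 0.0)  # ReLU bottleneck
        return x + hidden @ self.w_up              # residual connection

adapter = BottleneckAdapter(d_model=512, d_bottleneck=64)
x = np.random.default_rng(1).normal(size=(10, 512))
y = adapter(x)
print(y.shape)            # (10, 512)
print(np.allclose(x, y))  # True: identity at initialization
```

The parameter overhead of one such adapter is 2 × d_model × d_bottleneck weights, which is small relative to a full Transformer layer when the bottleneck is narrow; this is one common way to keep per-language capacity cheap.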

Yaoming Zhu, Jiangtao Feng, Chengqi Zhao, Mingxuan Wang, Lei Li • 2021

Related benchmarks

Task                          | Dataset            | Result        | Rank
Medical Image Classification  | Covid (test)       | Accuracy 95.2 | 43
Chest X-ray Classification    | Tuberculosis (test)| Accuracy 99   | 23
Medical Image Classification  | Cell (test)        | Accuracy 93.6 | 14
Medical Image Classification  | Brain (test)       | Accuracy 95.2 | 14
Medical Image Classification  | BUSI (test)        | Accuracy 88.4 | 14
