
Multilingual Neural Machine Translation with Knowledge Distillation

About

Multilingual machine translation, which translates multiple languages with a single model, has attracted much attention due to its efficiency of offline training and online serving. However, traditional multilingual translation usually yields inferior accuracy compared with the counterpart using individual models for each language pair, due to language diversity and model capacity limitations. In this paper, we propose a distillation-based approach to boost the accuracy of multilingual machine translation. Specifically, individual models are first trained and regarded as teachers, and then the multilingual model is trained to fit the training data and match the outputs of the individual models simultaneously through knowledge distillation. Experiments on the IWSLT, WMT and TED talk translation datasets demonstrate the effectiveness of our method. In particular, we show that one model is enough to handle multiple languages (up to 44 languages in our experiment), with comparable or even better accuracy than individual models.
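The training objective described above combines the usual data likelihood with a term that matches each per-language teacher's output distribution. Below is a minimal, illustrative sketch of such a word-level distillation loss on toy probability vectors; the function name, the interpolation weight `alpha`, and the toy distributions are assumptions for illustration, not the paper's exact formulation.

```python
import math

def distillation_loss(student_probs, teacher_probs, target_idx, alpha=0.5):
    """Interpolate the data NLL with a cross-entropy term that pushes the
    multilingual student toward the per-language teacher's soft targets."""
    # Negative log-likelihood of the ground-truth token under the student.
    nll = -math.log(student_probs[target_idx])
    # Cross-entropy between the teacher's distribution and the student's.
    ce_teacher = -sum(t * math.log(s)
                      for t, s in zip(teacher_probs, student_probs))
    return (1 - alpha) * nll + alpha * ce_teacher

# Toy vocabulary of 3 tokens; the teacher puts most mass on the gold token.
student = [0.2, 0.5, 0.3]
teacher = [0.1, 0.8, 0.1]
loss = distillation_loss(student, teacher, target_idx=1, alpha=0.5)
```

With `alpha=0` this reduces to ordinary cross-entropy training on the data; with `alpha=1` the student only imitates the teacher.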

Xu Tan, Yi Ren, Di He, Tao Qin, Zhou Zhao, Tie-Yan Liu • 2019

Related benchmarks

Task | Dataset | Metric | Result | Rank
Machine Translation | WMT16 EN-RO (test) | BLEU | 35.8 | 56
Machine Translation | WMT En-Fi 17 (test) | BLEU (tokenized) | 22 | 14
Many-to-One Multilingual Machine Translation | TED-8-DIVERSE base (test) | BLEU | 29.52 | 14
One-to-Many Multilingual Machine Translation | TED-8-DIVERSE base (test) | BLEU | 22.31 | 14
Many-to-One Multilingual Machine Translation | WMT-6 base (test) | BLEU | 20.18 | 10
One-to-Many Multilingual Machine Translation | WMT-6 base (test) | BLEU | 18.57 | 10
