
Incorporating BERT into Neural Machine Translation

About

The recently proposed BERT has shown great power on a variety of natural language understanding tasks, such as text classification and reading comprehension. However, how to effectively apply BERT to neural machine translation (NMT) has not been sufficiently explored. While BERT is more commonly fine-tuned for downstream language understanding tasks rather than used as a contextual embedding, our preliminary exploration shows that, in NMT, using BERT as a contextual embedding outperforms using it for fine-tuning. This motivates us to investigate how to better leverage BERT for NMT along this direction. We propose a new algorithm, the BERT-fused model, in which we first use BERT to extract representations for an input sequence, and then fuse these representations into each layer of the encoder and decoder of the NMT model through attention mechanisms. We conduct experiments on supervised (including sentence-level and document-level translation), semi-supervised, and unsupervised machine translation, and achieve state-of-the-art results on seven benchmark datasets. Our code is available at \url{https://github.com/bert-nmt/bert-nmt}.
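The fusion step described above can be sketched in a few lines. The following is a minimal, illustrative numpy sketch (not the authors' implementation): it assumes the fusion works by computing self-attention over the NMT hidden states and a separate attention over the BERT representations, then averaging the two, and it omits multi-head projections, feed-forward sublayers, residual connections, and layer normalization.

```python
import numpy as np

def attention(q, k, v):
    # Scaled dot-product attention: softmax(Q K^T / sqrt(d)) V
    d = q.shape[-1]
    scores = q @ k.T / np.sqrt(d)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v

def bert_fused_encoder_step(h, h_bert):
    # One simplified encoder step of a BERT-fused layer (assumed form):
    # self-attention over the NMT states and attention into the fixed
    # BERT representations are computed separately, then averaged.
    self_attn = attention(h, h, h)
    bert_attn = attention(h, h_bert, h_bert)
    return 0.5 * (self_attn + bert_attn)

rng = np.random.default_rng(0)
h = rng.standard_normal((5, 16))       # 5 source tokens, d_model = 16
h_bert = rng.standard_normal((7, 16))  # 7 BERT wordpieces (lengths may differ)
out = bert_fused_encoder_step(h, h_bert)
print(out.shape)  # (5, 16)
```

Note that the BERT sequence may be tokenized differently from the NMT input (here 7 wordpieces vs. 5 tokens); attention handles this mismatch naturally, since the queries come from the NMT states while the keys and values come from BERT.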

Jinhua Zhu, Yingce Xia, Lijun Wu, Di He, Tao Qin, Wengang Zhou, Houqiang Li, Tie-Yan Liu • 2020

Related benchmarks

Task | Dataset | Metric | Result | Rank
Machine Translation | WMT En-De 2014 (test) | BLEU | 30.75 | 379
Machine Translation | WMT En-Fr 2014 (test) | BLEU | 43.78 | 237
Machine Translation | IWSLT De-En 2014 (test) | BLEU | 36.69 | 146
Multimodal Machine Translation | Multi30K (test) | -- | -- | 139
Machine Translation | WMT 2014 (test) | BLEU | 30.75 | 100
Machine Translation | IWSLT En-De 2014 (test) | BLEU | 31.02 | 92
Multimodal Machine Translation | Multi30k En-De 2017 (test) | METEOR | 60.8 | 45
Machine Translation | WMT En-De (newstest2014) | BLEU | 30.75 | 43
Machine Translation | WMT14 English-French (newstest2014) | BLEU | 43.78 | 39
Machine Translation | IWSLT De-En 2014 | BLEU | 36.11 | 33

Showing 10 of 30 rows

Other info

Code: https://github.com/bert-nmt/bert-nmt