
An Effective Approach to Unsupervised Machine Translation

About

While machine translation has traditionally relied on large amounts of parallel corpora, a recent line of research has managed to train both Neural Machine Translation (NMT) and Statistical Machine Translation (SMT) systems using monolingual corpora only. In this paper, we identify and address several deficiencies of existing unsupervised SMT approaches by exploiting subword information, developing a theoretically well-founded unsupervised tuning method, and incorporating a joint refinement procedure. Moreover, we use our improved SMT system to initialize a dual NMT model, which is further fine-tuned through on-the-fly back-translation. Together, these changes yield large improvements over the previous state of the art in unsupervised machine translation. For instance, we obtain 22.5 BLEU points on English-to-German WMT 2014, 5.5 points more than the previous best unsupervised system, and 0.5 points more than the (supervised) shared task winner back in 2014.
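The on-the-fly back-translation mentioned above can be sketched as a simple loop: each monolingual target sentence is translated back into the source language by the reverse model, and the resulting synthetic pair is used to train the forward model. The sketch below is illustrative only; `translate_backward` and `train_step` are hypothetical stand-ins for real model calls, not code from the paper.

```python
# Hypothetical sketch of on-the-fly back-translation for fine-tuning a
# dual NMT model. All function names here are illustrative stand-ins.

def translate_backward(sentence):
    # Stand-in for the target->source model; a real system would decode
    # with the current reverse NMT model. Here we just reverse word order.
    return " ".join(reversed(sentence.split()))

def train_step(model, src, tgt):
    # Stand-in for one gradient update on a (source, target) pair;
    # here it simply records the synthetic pair.
    model.append((src, tgt))

def backtranslation_epoch(model, monolingual_target):
    # For each monolingual target sentence, synthesize a source sentence
    # with the reverse model, then train the forward model on the pair.
    for tgt in monolingual_target:
        synthetic_src = translate_backward(tgt)
        train_step(model, synthetic_src, tgt)
    return model

pairs = backtranslation_epoch([], ["das ist gut", "guten morgen"])
```

In the actual approach both translation directions are trained this way in alternation, so each model keeps improving on the synthetic data produced by the other.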

Mikel Artetxe, Gorka Labaka, Eneko Agirre • 2019

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Machine Translation | WMT 2014 (test) | BLEU | 36.2 | 100 |
| Machine Translation (tr->en) | TED | ChrF++ | 0.087 | 7 |
| Machine Translation (tr->en) | FLORES | ChrF++ | 9.3 | 7 |
| Machine Translation (tr->en) | Tatoeba | ChrF++ | 8.1 | 7 |
| Machine Translation (tr->en) | TED (test) | BLEU | 0.2 | 7 |
| Machine Translation (tr->en) | Flores (test) | BLEU | 0.1 | 7 |
| Machine Translation (tr->en) | Tatoeba (test) | BLEU | 0.1 | 7 |
| Machine Translation (en->tr) | TED | ChrF++ | 10.8 | 6 |
| Machine Translation (en->tr) | FLORES | ChrF++ | 12.1 | 6 |
| Machine Translation (en->tr) | Tatoeba | ChrF++ | 12.5 | 6 |

(10 of 13 rows shown)
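The ChrF++ entries in the table above are character n-gram F-scores. As a toy illustration of the core idea only (the official metric averages over character n-grams of order 1-6 plus word n-grams, which this sketch omits), a single-order character F-score can be computed as:

```python
from collections import Counter

def char_ngrams(text, n):
    # Multiset of character n-grams of a string.
    return Counter(text[i:i + n] for i in range(len(text) - n + 1))

def chrf_single_order(hyp, ref, n=2, beta=2.0):
    # Toy character n-gram F-score in the spirit of chrF; recall is
    # weighted beta times more than precision, as in the real metric.
    h, r = char_ngrams(hyp, n), char_ngrams(ref, n)
    overlap = sum((h & r).values())
    if overlap == 0:
        return 0.0
    prec = overlap / sum(h.values())
    rec = overlap / sum(r.values())
    return (1 + beta ** 2) * prec * rec / (beta ** 2 * prec + rec)
```

For real evaluation one would use a standard implementation such as sacreBLEU rather than this sketch.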

Other info

Code
