
Unsupervised Machine Translation Using Monolingual Corpora Only

About

Machine translation has recently achieved impressive performance thanks to advances in deep learning and the availability of large-scale parallel corpora. There have been numerous attempts to extend these successes to low-resource language pairs, yet these approaches still require tens of thousands of parallel sentences. In this work, we take this research direction to the extreme and investigate whether it is possible to learn to translate even without any parallel data. We propose a model that takes sentences from monolingual corpora in two different languages and maps them into the same latent space. By learning to reconstruct in both languages from this shared feature space, the model effectively learns to translate without using any labeled data. We demonstrate our model on two widely used datasets and two language pairs, reporting BLEU scores of 32.8 and 15.1 on the Multi30k and WMT English-French datasets, without using even a single parallel sentence at training time.
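The core idea above can be illustrated with a deliberately tiny sketch: sentences from two monolingual corpora are encoded into one shared latent space, and a per-language decoder reconstructs from that same space. This is not the paper's model (which learns the space with neural encoders, denoising autoencoding, and back-translation); the word-to-id "encoders" below are hypothetical stand-ins that just make the shared-space mechanics concrete.

```python
# Toy illustration (not the paper's architecture): a shared latent space.
# Each language has an "encoder" into the space and a "decoder" out of it.
# The word-id mappings here are invented for illustration only.
en_encoder = {"cat": 0, "dog": 1, "house": 2}
fr_encoder = {"chat": 0, "chien": 1, "maison": 2}

# Decoders invert each encoder: latent ids back to words of one language.
en_decoder = {v: k for k, v in en_encoder.items()}
fr_decoder = {v: k for k, v in fr_encoder.items()}

def encode(sentence, encoder):
    """Map a sentence into the shared latent space (a list of ids)."""
    return [encoder[w] for w in sentence.split()]

def decode(latent, decoder):
    """Map a point in the shared latent space back to a sentence."""
    return " ".join(decoder[z] for z in latent)

# Monolingual training signal: reconstruct each language from the space.
assert decode(encode("cat house", en_encoder), en_decoder) == "cat house"
assert decode(encode("chat maison", fr_encoder), fr_decoder) == "chat maison"

# Translation then falls out of the shared space with no parallel data:
# encode in one language, decode with the other language's decoder.
print(decode(encode("cat house", en_encoder), fr_decoder))  # chat maison
```

In the actual model, the latent space is continuous and is aligned across languages by training objectives rather than by construction, but the inference path is the same: encode the source sentence, decode with the target-language decoder.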

Guillaume Lample, Alexis Conneau, Ludovic Denoyer, Marc'Aurelio Ranzato • 2017

Related benchmarks

| Task                | Dataset                              | Metric     | Result | Rank |
|---------------------|--------------------------------------|------------|--------|------|
| Machine Translation | WMT En-Fr 2014 (test)                | BLEU       | 15.05  | 237  |
| Machine Translation | WMT16 English-German (test)          | BLEU       | 9.64   | 58   |
| Machine Translation | WMT16 German-English (test)          | BLEU       | 13.33  | 39   |
| Machine Translation | Multi30k Task1 (en-de)               | BLEU Score | 22.74  | 26   |
| Machine Translation | Multi30k Task1 (en-fr)               | BLEU Score | 32.76  | 25   |
| Machine Translation | Multi30k M30kT (test)                | BLEU Score | 32.07  | 19   |
| Machine Translation | WMT19 English-German (En-De) (test)  | BLEU       | 26.7   | 19   |
| Machine Translation | English-Belarusian (BE)              | BLEU       | 140    | 15   |
| Machine Translation | English-Ukrainian                    | BLEU       | 0.9    | 15   |
| Machine Translation | WMT en-de                            | BLEU       | 9.75   | 10   |
Showing 10 of 32 rows
