
Unsupervised Neural Machine Translation with Weight Sharing

About

Unsupervised neural machine translation (NMT) is a recently proposed approach to machine translation that aims to train a model without any labeled data. Existing models for unsupervised NMT typically use a single shared encoder to map sentence pairs from different languages into a shared latent space, which is weak at preserving the unique, internal characteristics of each language, such as style, terminology, and sentence structure. To address this issue, we introduce an extension that uses two independent encoders which share only the partial weights responsible for extracting high-level representations of the input sentences. In addition, two generative adversarial networks (GANs), a local GAN and a global GAN, are proposed to enhance cross-language translation. With this new approach, we achieve significant improvements on English-German, English-French, and Chinese-to-English translation tasks.
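The core architectural idea in the abstract can be sketched in a few lines: each language keeps its own low-level encoder layers, while the top layer's weights are literally the same object for both languages, so the high-level representations of both languages land in a common latent space. This is a minimal illustrative sketch in numpy, not the authors' implementation; layer sizes, activations, and the single-shared-layer split are assumptions for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

def init_layer(dim_in, dim_out):
    """Small random weight matrix (stand-in for a trained layer)."""
    return rng.standard_normal((dim_in, dim_out)) * 0.1

# Language-specific lower layers: one per language, free to model
# each language's own style, terminology, and structure.
enc_low_src = init_layer(16, 32)
enc_low_tgt = init_layer(16, 32)

# Shared top layer: the SAME weight matrix is used by both encoders,
# so high-level representations live in a common latent space.
enc_top_shared = init_layer(32, 32)

def encode(x, low_weights, top_weights):
    h = np.tanh(x @ low_weights)    # language-specific features
    return np.tanh(h @ top_weights)  # shared high-level representation

# Toy "sentence" batches for each language (4 sentences, 16-dim inputs).
x_src = rng.standard_normal((4, 16))
x_tgt = rng.standard_normal((4, 16))

z_src = encode(x_src, enc_low_src, enc_top_shared)
z_tgt = encode(x_tgt, enc_low_tgt, enc_top_shared)

# Both languages are mapped into the same 32-dim latent space.
print(z_src.shape, z_tgt.shape)
```

In the paper's full model, the local and global GANs would additionally push `z_src` and `z_tgt` toward an indistinguishable distribution; this sketch only shows the weight-sharing split between the private lower layers and the shared upper layer.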

Zhen Yang, Wei Chen, Feng Wang, Bo Xu • 2018

Related benchmarks

Task                         | Dataset                     | Metric | Result | Rank
Machine Translation          | WMT En-Fr 2014 (test)       | BLEU   | 16.97  | 237
Machine Translation          | WMT16 English-German (test) | BLEU   | 10.86  | 58
Machine Translation          | WMT16 German-English (test) | BLEU   | 14.62  | 39
Machine Translation          | WMT En-Fr (newstest2014)    | BLEU   | 16.97  | 9
Machine Translation          | WMT en-de 2016 (newstest)   | BLEU   | 10.86  | 9
Machine Translation (De-En)  | WMT 2016 (test)             | BLEU   | 14.62  | 9
Machine Translation          | WMT fr-en 2014 (newstest)   | BLEU   | 15.58  | 6
Machine Translation          | WMT de-en 2016 (newstest)   | BLEU   | 14.62  | 6
Machine Translation          | LDC Chinese-English (test)  | BLEU   | 14.52  | 3

Other info

Code
