
Neural Machine Translation with Monolingual Translation Memory

About

Prior work has shown that a translation memory (TM) can boost the performance of Neural Machine Translation (NMT). In contrast to existing work, which uses a bilingual corpus as the TM and employs source-side similarity search for memory retrieval, we propose a new framework that uses monolingual memory and performs learnable memory retrieval in a cross-lingual manner. Our framework has two unique advantages. First, the cross-lingual memory retriever allows abundant monolingual data to serve as TM. Second, the memory retriever and the NMT model can be jointly optimized for the ultimate translation goal. Experiments show that the proposed method obtains substantial improvements; remarkably, it even outperforms strong TM-augmented NMT baselines that use bilingual TM. Owing to its ability to leverage monolingual data, our model also demonstrates effectiveness in low-resource and domain-adaptation scenarios.
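To make the abstract's key component concrete, here is a minimal sketch of a cross-lingual dual-encoder retriever: a source-language query is scored against target-language (monolingual) memory sentences by a dot product of sentence vectors. All names, the mean-pooling encoders, and the random embedding tables are illustrative assumptions, not the paper's actual architecture; in the real model the encoders are learned jointly with the NMT decoder.

```python
# Illustrative sketch (not the paper's implementation) of cross-lingual
# memory retrieval: score target-language TM sentences against a
# source-language query so retrieval works across languages.
import numpy as np

rng = np.random.default_rng(0)
DIM = 8     # hypothetical sentence-embedding size
VOCAB = 100 # hypothetical shared vocabulary size

# Stand-in "learned" embedding tables for the two languages.
src_emb_table = rng.normal(size=(VOCAB, DIM))
tgt_emb_table = rng.normal(size=(VOCAB, DIM))

def encode(token_ids, table):
    """Mean-pool token embeddings into one sentence vector."""
    return table[token_ids].mean(axis=0)

def retrieve(src_ids, tm_sentences, k=2):
    """Return indices of the k monolingual TM sentences with the
    highest dot-product relevance to the source query, plus scores."""
    query = encode(src_ids, src_emb_table)
    scores = np.array([query @ encode(m, tgt_emb_table)
                       for m in tm_sentences])
    return np.argsort(-scores)[:k], scores

# Toy monolingual (target-side) translation memory as token-id lists.
tm = [[1, 2, 3], [4, 5], [6, 7, 8, 9]]
top_idx, scores = retrieve([10, 11, 12], tm, k=2)
print(top_idx)
```

Because the relevance scores are differentiable in the embedding tables, such a retriever can in principle be trained end-to-end with the translation loss, which is what "learnable memory retrieval" refers to in the abstract.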

Deng Cai, Yan Wang, Huayang Li, Wai Lam, Lemao Liu• 2021

Related benchmarks

Task                             Dataset                            Metric  Result  Rank
Machine Translation              JRC-Acquis En-Es (dev)             BLEU    64.18   18
Machine Translation              JRC-Acquis Es-En (dev)             BLEU    67.73   18
Machine Translation              JRC-Acquis De-En (dev)             BLEU    64.48   18
Machine Translation              JRC-Acquis En-De (dev)             BLEU    58.77   18
Machine Translation              JRC-Acquis En-De (test)            BLEU    58.42   18
Machine Translation (De to En)   JRC-Acquis high-resource (test)    BLEU    64.62   16
Machine Translation (Es to En)   JRC-Acquis high-resource (test)    BLEU    67.42   16
Machine Translation (En to Es)   JRC-Acquis high-resource (test)    BLEU    63.86   16
Machine Translation              Multi-Domain (test)                --      --      15
Machine Translation (En to De)   JRC-Acquis high-resource (test)    BLEU    57.79   5

(Showing 10 of 14 rows)
