
Effective Approaches to Attention-based Neural Machine Translation

About

An attentional mechanism has lately been used to improve neural machine translation (NMT) by selectively focusing on parts of the source sentence during translation. However, there has been little work exploring useful architectures for attention-based NMT. This paper examines two simple and effective classes of attentional mechanism: a global approach which always attends to all source words and a local one that only looks at a subset of source words at a time. We demonstrate the effectiveness of both approaches over the WMT translation tasks between English and German in both directions. With local attention, we achieve a significant gain of 5.0 BLEU points over non-attentional systems which already incorporate known techniques such as dropout. Our ensemble model using different attention architectures has established a new state-of-the-art result in the WMT'15 English to German translation task with 25.9 BLEU points, an improvement of 1.0 BLEU points over the existing best system backed by NMT and an n-gram reranker.
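The global approach described above can be sketched in a few lines: score the current target hidden state against every source hidden state, normalize the scores into alignment weights, and take the weighted average of the source states as the context vector. The sketch below uses the "dot" scoring function; the variable names and shapes are illustrative, not taken from the paper's implementation. The local variant would differ only in restricting the scored positions to a window around a predicted source position.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - x.max())
    return e / e.sum()

def global_attention(h_t, h_s):
    """Global attention with the 'dot' score (a minimal sketch).

    h_t : (d,)    current target hidden state
    h_s : (n, d)  hidden states for all n source words
    Returns the context vector c_t (d,) and alignment weights a_t (n,).
    """
    scores = h_s @ h_t     # score(h_t, h_s_i) = h_t . h_s_i, one score per source word
    a_t = softmax(scores)  # alignment weights over ALL source positions (global)
    c_t = a_t @ h_s        # context vector: weighted average of source states
    return c_t, a_t
```

The alignment weights always sum to one, and source states most similar to the target state dominate the context vector.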

Minh-Thang Luong, Hieu Pham, Christopher D. Manning • 2015

Related benchmarks

Task | Dataset | Result | Rank
Machine Translation | WMT En-De 2014 (test) | BLEU 20.9 | 379
Multimodal Machine Translation | Multi30K (test) | BLEU-4 37.7 | 139
Machine Translation | WMT English-German 2014 (test) | BLEU 23 | 136
Text Summarization | DUC 2004 (test) | ROUGE-1 28.55 | 115
Sign Language Translation | CSL-Daily (test) | BLEU-4 7.56 | 99
Text Summarization | Gigaword (test) | ROUGE-1 33.1 | 75
Machine Translation (Chinese-to-English) | NIST 2003 (MT-03) | BLEU 36.44 | 52
Sign Language Translation | PHOENIX14T (test) | BLEU-4 9 | 50
Machine Translation | WMT En-De (newstest2014) | BLEU 20.9 | 43
Story Ending Generation | ROCStories (test) | BLEU-1 19.1 | 43
Showing 10 of 35 rows
