
Document-Level Neural Machine Translation with Hierarchical Attention Networks

About

Neural Machine Translation (NMT) can be improved by including document-level contextual information. For this purpose, we propose a hierarchical attention model to capture the context in a structured and dynamic manner. The model is integrated in the original NMT architecture as another level of abstraction, conditioning on the NMT model's own previous hidden states. Experiments show that hierarchical attention significantly improves the BLEU score over a strong NMT baseline with the state-of-the-art in context-aware methods, and that both the encoder and decoder benefit from context in complementary ways.
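The two-level idea in the abstract (attend over words within each previous sentence, then attend over the per-sentence summaries) can be sketched as follows. This is a minimal illustration only: the names, dimensions, and the use of plain dot-product attention are assumptions, not the paper's implementation, which uses learned multi-head attention integrated into the NMT encoder and decoder.

```python
# Hedged sketch of hierarchical (two-level) attention over previous
# sentences. Plain dot-product attention stands in for the paper's
# learned multi-head attention; all dimensions are made up.
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attend(query, keys, values):
    """Dot-product attention: weight values by key-query similarity."""
    scores = keys @ query                      # (n,)
    weights = softmax(scores)                  # (n,), sums to 1
    return weights @ values                    # (d,)

def hierarchical_context(query, prev_sentences):
    """Level 1: summarize each previous sentence w.r.t. the query.
    Level 2: attend over the sentence summaries to get one
    document-level context vector."""
    summaries = np.stack([attend(query, s, s) for s in prev_sentences])
    return attend(query, summaries, summaries)

rng = np.random.default_rng(0)
d = 8                                          # hidden size (assumed)
query = rng.standard_normal(d)                 # current hidden state
# Three previous sentences of 5, 7, and 4 word states each (assumed).
prev = [rng.standard_normal((n, d)) for n in (5, 7, 4)]
context = hierarchical_context(query, prev)    # (d,) context vector
```

In the model proper, this context vector is then combined with the NMT model's own hidden state (the paper conditions on the model's previous hidden states rather than re-encoding raw words).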

Lesly Miculicich, Dhananjay Ram, Nikolaos Pappas, James Henderson • 2018

Related benchmarks

Task | Dataset | Result | Rank
Machine Translation | IWSLT De-En 2014 (test) | BLEU 33.97 | 146
Machine Translation | IWSLT En-De 2014 (test) | BLEU 27.94 | 92
Machine Translation | En -> De (test) | BLEU 33.16 | 23
English-German document-level translation | News English-German (test) | s-BLEU 25.03 | 20
English-German document-level translation | TED English-German (test) | s-BLEU 24.58 | 20
English-German document-level translation | Europarl English-German (test) | s-BLEU 28.6 | 20
Machine Translation | En-Fr (test) | BLEU 41.95 | 17
Document-Level Machine Translation | TED15 Zh-En 2010-2013 (test) | d-BLEU 24 | 16
Machine Translation | En-Ru (test) | BLEU 31.23 | 14
Contrastive Translation Evaluation | ContraPro En-Fr | Accuracy 84.32 | 9
(10 of 12 benchmark rows shown.)
