
Linguistic Input Features Improve Neural Machine Translation

About

Neural machine translation has recently achieved impressive results, while using little in the way of external linguistic information. In this paper we show that the strong learning capability of neural MT models does not make linguistic features redundant; they can be easily incorporated to provide further improvements in performance. We generalize the embedding layer of the encoder in the attentional encoder-decoder architecture to support the inclusion of arbitrary features, in addition to the baseline word feature. We add morphological features, part-of-speech tags, and syntactic dependency labels as input features to English↔German and English→Romanian neural machine translation systems. In experiments on WMT16 training and test sets, we find that linguistic input features improve model quality according to three metrics: perplexity, BLEU, and CHRF3. An open-source implementation of our neural MT system is available, as are sample files and configurations.
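The generalized embedding layer described above can be sketched as follows: each input feature (word, POS tag, dependency label, etc.) gets its own embedding table, and the per-feature vectors for a token are combined into a single input vector. This is a minimal NumPy sketch under the assumption that the features are combined by concatenation; the vocabulary sizes, dimensions, and helper names here are illustrative, not taken from the paper's released implementation.

```python
import numpy as np

# Hypothetical vocabulary sizes and embedding dimensions per input feature.
feature_vocab_sizes = {"word": 100, "pos": 10, "dep": 12}
feature_dims = {"word": 8, "pos": 2, "dep": 2}

rng = np.random.default_rng(0)
# One randomly initialized embedding table per feature.
tables = {name: rng.standard_normal((size, feature_dims[name]))
          for name, size in feature_vocab_sizes.items()}

def embed_token(feature_ids):
    """Look up each feature's embedding and concatenate them
    into a single input vector for the encoder."""
    return np.concatenate([tables[name][idx]
                           for name, idx in feature_ids.items()])

# A token annotated with a word id, a POS-tag id, and a dependency-label id.
vec = embed_token({"word": 3, "pos": 1, "dep": 5})
print(vec.shape)  # total dimension is the sum of the per-feature dims: 8+2+2
```

The point of the design is that the rest of the encoder is unchanged: it still consumes one vector per token, so any number of annotation layers can be added without touching the recurrent or attention components.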

Rico Sennrich, Barry Haddow • 2016

Related benchmarks

Task | Dataset | Result | Rank
Machine Translation | English-Romanian 2016 (test) | BLEU 29.2 | 12
Machine Translation | WMT'15 German-English (test) | BLEU 32.1 | 11
Machine Translation | German-English newstest 2016 (test) | BLEU 38.5 | 10
Machine Translation | German-English newstest 2013 (dev) | Perplexity 44.1 | 8
Machine Translation | newstest English-German 2015 (test) | BLEU 28.7 | 8
Machine Translation | English-Romanian news 2016 (dev) | Perplexity 50.1 | 4
Machine Translation (English to German) | newstest 2016 (test) | BLEU 28.4 | 2

Other info

Code
