
Neural Machine Translation by Jointly Learning to Align and Translate

About

Neural machine translation is a recently proposed approach to machine translation. Unlike traditional statistical machine translation, neural machine translation aims at building a single neural network that can be jointly tuned to maximize translation performance. The models recently proposed for neural machine translation often belong to a family of encoder-decoders and consist of an encoder that encodes a source sentence into a fixed-length vector from which a decoder generates a translation. In this paper, we conjecture that the use of a fixed-length vector is a bottleneck in improving the performance of this basic encoder-decoder architecture, and propose to extend it by allowing a model to automatically (soft-)search for parts of a source sentence that are relevant to predicting a target word, without having to form these parts as a hard segment explicitly. With this new approach, we achieve translation performance comparable to the existing state-of-the-art phrase-based system on the task of English-to-French translation. Furthermore, qualitative analysis reveals that the (soft-)alignments found by the model agree well with our intuition.
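The (soft-)search described above is an additive alignment model: at each decoder step, an MLP scores every encoder annotation against the previous decoder state, and a softmax over those scores yields alignment weights that form a context vector. A minimal NumPy sketch of one such step (the parameter names W_a, U_a, v_a follow the paper's notation; the random inputs are purely illustrative):

```python
import numpy as np

def additive_attention(s_prev, h, W_a, U_a, v_a):
    """One decoder step of additive (Bahdanau-style) attention.

    s_prev : (n,)   previous decoder hidden state s_{i-1}
    h      : (T, m) encoder annotations h_1..h_T, one per source position
    W_a    : (n, p) projection of the decoder state
    U_a    : (m, p) projection of the annotations
    v_a    : (p,)   scoring vector
    Returns the context vector c_i (m,) and alignment weights alpha (T,).
    """
    # Alignment scores: e_ij = v_a^T tanh(W_a s_{i-1} + U_a h_j)
    e = np.tanh(s_prev @ W_a + h @ U_a) @ v_a   # (T,)
    # Softmax over source positions -> alignment weights alpha_ij
    alpha = np.exp(e - e.max())
    alpha /= alpha.sum()
    # Context vector: expected annotation under alpha
    c = alpha @ h                               # (m,)
    return c, alpha

# Illustrative usage with random annotations and parameters
rng = np.random.default_rng(0)
T, n, m, p = 5, 4, 6, 3
h = rng.normal(size=(T, m))
s_prev = rng.normal(size=n)
W_a, U_a, v_a = rng.normal(size=(n, p)), rng.normal(size=(m, p)), rng.normal(size=p)
c, alpha = additive_attention(s_prev, h, W_a, U_a, v_a)
```

Because the weights are recomputed at every target position, the decoder is not forced to squeeze the whole source sentence into one fixed-length vector, which is the bottleneck the paper identifies.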

Dzmitry Bahdanau, Kyunghyun Cho, Yoshua Bengio • 2014

Related benchmarks

Task                                      | Dataset                 | Result         | Rank
Multivariate Forecasting                  | ETTh1                   | MSE 0.991      | 645
Multivariate Time-series Forecasting      | ETTm1                   | MSE 0.444      | 433
Multivariate long-term series forecasting | ETTh2                   | MSE 1.552      | 319
Machine Translation                       | WMT En-Fr 2014 (test)   | BLEU 28.45     | 237
Long-term time-series forecasting         | ETTh1 (test)            | MSE 0.114      | 221
Hallucination Detection                   | TriviaQA (test)         | AUC-ROC 42     | 169
Machine Translation                       | IWSLT De-En 2014 (test) | BLEU 29.98     | 146
Multimodal Machine Translation            | Multi30K (test)         | BLEU-4 33.7    | 139
Speech Recognition                        | WSJ (92-eval)           | WER 16         | 131
Scene Text Recognition                    | SVT 647 (test)          | Accuracy 85.9  | 101
Showing 10 of 113 rows
