
On the Properties of Neural Machine Translation: Encoder-Decoder Approaches

About

Neural machine translation is a relatively new approach to statistical machine translation based purely on neural networks. Neural machine translation models often consist of an encoder and a decoder. The encoder extracts a fixed-length representation from a variable-length input sentence, and the decoder generates a correct translation from this representation. In this paper, we focus on analyzing the properties of neural machine translation using two models: RNN Encoder–Decoder and a newly proposed gated recursive convolutional neural network. We show that neural machine translation performs relatively well on short sentences without unknown words, but its performance degrades rapidly as the length of the sentence and the number of unknown words increase. Furthermore, we find that the proposed gated recursive convolutional network learns the grammatical structure of a sentence automatically.
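The core idea the abstract describes can be sketched in a few lines: an encoder compresses a variable-length token sequence into one fixed-length vector, and a decoder generates output tokens conditioned only on that vector. This is a minimal toy sketch with plain tanh recurrences and randomly initialized parameters (the dimensions, parameter names, and greedy decoding loop are illustrative assumptions, not the paper's gated RNN Encoder–Decoder or its trained weights):

```python
import numpy as np

# Toy dimensions (assumptions for illustration, not the paper's settings).
rng = np.random.default_rng(0)
vocab_size, emb_dim, hid_dim = 10, 4, 8

# Randomly initialized parameters; a real model would learn these.
E = rng.normal(size=(vocab_size, emb_dim))           # token embeddings
W_xh = rng.normal(size=(emb_dim, hid_dim)) * 0.1     # input-to-hidden
W_hh = rng.normal(size=(hid_dim, hid_dim)) * 0.1     # hidden-to-hidden
W_hy = rng.normal(size=(hid_dim, vocab_size)) * 0.1  # hidden-to-output

def encode(src_tokens):
    """Compress a variable-length source into one fixed-length vector."""
    h = np.zeros(hid_dim)
    for t in src_tokens:
        h = np.tanh(E[t] @ W_xh + h @ W_hh)
    return h  # same size no matter how long src_tokens is

def decode(summary, max_len=5):
    """Greedily emit target tokens conditioned on the fixed summary."""
    h, out = summary, []
    for _ in range(max_len):
        tok = int(np.argmax(h @ W_hy))  # pick most likely next token
        out.append(tok)
        h = np.tanh(E[tok] @ W_xh + h @ W_hh)
    return out

summary = encode([1, 2, 3, 4, 5])
print(summary.shape)         # (8,) -- fixed length for any input length
print(len(decode(summary)))  # 5
```

The fixed-size bottleneck in `encode` is exactly what the paper probes: however long the source sentence, everything the decoder sees is squeezed into that one vector, which is consistent with the observed degradation on long sentences.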

Kyunghyun Cho, Bart van Merrienboer, Dzmitry Bahdanau, Yoshua Bengio • 2014

Related benchmarks

Task                         | Dataset                 | Result           | Rank
Subjectivity Classification  | Subj                    | Accuracy 89.5    | 266
Question Classification      | TREC                    | Accuracy 88.4    | 205
Long-range sequence modeling | Long Range Arena (LRA)  | Text Accuracy 86.7 | 164
Opinion Polarity Detection   | MPQA                    | Accuracy 84.5    | 154
Sentiment Classification     | MR                      | Accuracy 76.3    | 148
Sentiment Classification     | CR                      | Accuracy 81.3    | 142
Traffic Prediction           | Los-loop                | RMSE 5.2182      | 65
Traffic Prediction           | SZ-taxi                 | RMSE 3.9994      | 65
Mortality Prediction         | MIMIC-IV (test)         | AUC 63.33        | 43
Code Search                  | Python (test)           | Recall@1 11.1    | 25
Showing 10 of 44 rows
