
Learning Phrase Representations using RNN Encoder-Decoder for Statistical Machine Translation

About

In this paper, we propose a novel neural network model called RNN Encoder-Decoder that consists of two recurrent neural networks (RNNs). One RNN encodes a sequence of symbols into a fixed-length vector representation, and the other decodes the representation into another sequence of symbols. The encoder and decoder of the proposed model are jointly trained to maximize the conditional probability of a target sequence given a source sequence. The performance of a statistical machine translation system is empirically found to improve by using the conditional probabilities of phrase pairs computed by the RNN Encoder-Decoder as an additional feature in the existing log-linear model. Qualitatively, we show that the proposed model learns a semantically and syntactically meaningful representation of linguistic phrases.
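The core idea in the abstract can be sketched in a few lines: an encoder RNN folds the source sequence into one fixed-length vector c, and a decoder RNN is conditioned on c at every step while producing the target sequence. The toy implementation below is a hypothetical illustration only (plain tanh recurrences with made-up weight names, not the paper's GRU-style gated units or training procedure):

```python
import numpy as np

# Toy sketch of the RNN Encoder-Decoder idea. All sizes and weight
# matrices (E, W_e, W_d, W_c, W_o) are illustrative assumptions.
rng = np.random.default_rng(0)
V, H = 8, 4                                # vocab size, hidden size

E   = rng.normal(scale=0.1, size=(V, H))   # token embeddings
W_e = rng.normal(scale=0.1, size=(H, H))   # encoder recurrence
W_d = rng.normal(scale=0.1, size=(H, H))   # decoder recurrence
W_c = rng.normal(scale=0.1, size=(H, H))   # context vector -> decoder
W_o = rng.normal(scale=0.1, size=(H, V))   # decoder state -> vocab logits

def encode(src_tokens):
    """Compress a variable-length source sequence into one vector c."""
    h = np.zeros(H)
    for tok in src_tokens:
        h = np.tanh(E[tok] + W_e @ h)
    return h                               # fixed-length summary c

def decode_step(h, c, prev_tok):
    """One decoder step: c conditions every step, not just the first."""
    h = np.tanh(E[prev_tok] + W_d @ h + W_c @ c)
    logits = h @ W_o
    p = np.exp(logits - logits.max())
    return h, p / p.sum()                  # next state, P(next token)

c = encode([1, 5, 2])                      # whole source -> one vector
h, p = decode_step(np.zeros(H), c, prev_tok=0)
```

In training, the two networks would be optimized jointly so that the decoder's per-step distributions maximize the conditional probability of the target sequence given the source; that conditional probability is what the paper feeds into the SMT log-linear model as an extra feature.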

Kyunghyun Cho, Bart van Merrienboer, Caglar Gulcehre, Dzmitry Bahdanau, Fethi Bougares, Holger Schwenk, Yoshua Bengio • 2014

Related benchmarks

Task                       | Dataset            | Metric     | Result | Rank
Language Modeling          | WikiText-2 (test)  | PPL        | 71.8   | 1949
Language Modeling          | Penn Treebank (test) | Perplexity | 66.3 | 411
Time Series Forecasting    | ETTm2              | MSE        | 2.041  | 382
Subjectivity Classification | Subj              | Accuracy   | 90.5   | 329
Time Series Forecasting    | Weather            | MSE        | 0.369  | 295
Question Classification   | TREC               | Accuracy   | 82.8   | 259
Language Modeling          | WikiText-103 (val) | PPL        | 689.5  | 214
Text Classification       | TREC               | Accuracy   | 89.6   | 207
Time Series Forecasting    | Exchange           | MSE        | 1.453  | 199
Sentiment Analysis         | SST-5 (test)       | Accuracy   | 49.5   | 173
Showing 10 of 152 rows
