
Learning Generic Sentence Representations Using Convolutional Neural Networks

About

We propose a new encoder-decoder approach to learn distributed sentence representations that are applicable to multiple purposes. The model is learned by using a convolutional neural network as an encoder to map an input sentence into a continuous vector, and using a long short-term memory recurrent neural network as a decoder. Several tasks are considered, including sentence reconstruction and future sentence prediction. Further, a hierarchical encoder-decoder model is proposed to encode a sentence to predict multiple future sentences. By training our models on a large collection of novels, we obtain a highly generic convolutional sentence encoder that performs well in practice. Experimental results on several benchmark datasets, and across a broad range of applications, demonstrate the superiority of the proposed model over competing methods.
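The core of the approach is the convolutional encoder: word embeddings are convolved with a bank of filters and max-pooled over time to yield a fixed-size sentence vector, which the LSTM decoder then conditions on. A minimal NumPy sketch of that encoding step is below; the filter width, nonlinearity, and dimensions are illustrative assumptions, not the paper's exact configuration, and the decoder is omitted.

```python
import numpy as np

def cnn_sentence_encoder(tokens, emb, filters, width=3):
    """Map a token-id sequence to a fixed-size vector via a 1-D
    convolution over word embeddings plus max-over-time pooling.
    Illustrative sketch only, not the authors' implementation."""
    X = emb[tokens]                      # (seq_len, emb_dim) embedding lookup
    seq_len, emb_dim = X.shape
    n_filters = filters.shape[0]         # filters: (n_filters, width, emb_dim)
    # Valid convolution: slide each filter over every window of the sentence
    conv = np.empty((seq_len - width + 1, n_filters))
    for t in range(seq_len - width + 1):
        window = X[t:t + width]          # (width, emb_dim)
        conv[t] = np.tanh(np.tensordot(filters, window,
                                       axes=([1, 2], [0, 1])))
    # Max-over-time pooling: one value per filter, independent of length
    return conv.max(axis=0)              # (n_filters,)

# Toy usage with random parameters (hypothetical sizes)
rng = np.random.default_rng(0)
vocab, emb_dim, n_filters, width = 50, 8, 16, 3
emb = rng.standard_normal((vocab, emb_dim))
filters = rng.standard_normal((n_filters, width, emb_dim))
z = cnn_sentence_encoder([4, 7, 1, 9, 2], emb, filters, width)
print(z.shape)  # fixed-size sentence vector regardless of sentence length
```

Because the pooling step collapses the time axis, sentences of any length (at least `width` tokens) map to the same vector size, which is what makes the encoder reusable across the reconstruction and future-sentence-prediction objectives.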

Zhe Gan, Yunchen Pu, Ricardo Henao, Chunyuan Li, Xiaodong He, Lawrence Carin • 2016

Related benchmarks

Task                         Dataset  Metric    Result  Rank
Subjectivity Classification  Subj     Accuracy  93.6    266
Text Classification          TREC     Accuracy  92.6    179
Sentiment Classification     CR       Accuracy  82.0    142
Text Classification          MR       Accuracy  77.8    93
Text Classification          MPQA     Accuracy  89.4    25
