
Skip-Thought Vectors

About

We describe an approach for unsupervised learning of a generic, distributed sentence encoder. Using the continuity of text from books, we train an encoder-decoder model that tries to reconstruct the surrounding sentences of an encoded passage. Sentences that share semantic and syntactic properties are thus mapped to similar vector representations. We next introduce a simple vocabulary expansion method to encode words that were not seen as part of training, allowing us to expand our vocabulary to a million words. After training our model, we extract and evaluate our vectors with linear models on 8 tasks: semantic relatedness, paraphrase detection, image-sentence ranking, question-type classification and 4 benchmark sentiment and subjectivity datasets. The end result is an off-the-shelf encoder that can produce highly generic sentence representations that are robust and perform well in practice. We will make our encoder publicly available.
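The vocabulary expansion step can be sketched as a least-squares linear map from a large pretrained word-embedding space (e.g., word2vec) into the encoder's word-embedding space, fit on the words the two vocabularies share. The sketch below uses synthetic embeddings and illustrative variable names; the dimensions are assumptions based on common configurations, not the paper's exact setup.

```python
# Vocabulary expansion sketch: learn a linear map W from a word2vec-style
# space into the encoder's embedding space by least squares, using words
# present in both vocabularies. All data here is synthetic.
import numpy as np

rng = np.random.default_rng(0)

d_w2v, d_rnn = 300, 620      # assumed word2vec dim / encoder embedding dim
n_shared = 1000              # words seen by both models

# Synthetic stand-ins for the shared-vocabulary embeddings.
V_w2v = rng.standard_normal((n_shared, d_w2v))
W_true = rng.standard_normal((d_w2v, d_rnn)) / np.sqrt(d_w2v)
V_rnn = V_w2v @ W_true       # pretend the encoder embeddings are a linear image

# Solve min_W ||V_w2v @ W - V_rnn||^2 over the shared vocabulary.
W, *_ = np.linalg.lstsq(V_w2v, V_rnn, rcond=None)

# Any word with a word2vec embedding -- even one never seen during
# skip-thought training -- can now be projected into the encoder's space:
unseen_word_vec = rng.standard_normal(d_w2v)
mapped = unseen_word_vec @ W   # usable as input to the trained encoder
```

Because the map is linear and fit once on the shared words, it extends the usable vocabulary to every word the pretrained embeddings cover, without retraining the encoder.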

Ryan Kiros, Yukun Zhu, Ruslan Salakhutdinov, Richard S. Zemel, Antonio Torralba, Raquel Urtasun, Sanja Fidler • 2015

Related benchmarks

Task                         Dataset       Metric      Result   Rank
Natural Language Inference   SNLI (test)   Accuracy    87.7     681
Subjectivity Classification  Subj          Accuracy    94.2     266
Question Classification      TREC          Accuracy    92.2     205
Text Classification          TREC          Accuracy    93       179
Opinion Polarity Detection   MPQA          Accuracy    89.3     154
Sentiment Classification     MR            Accuracy    76.5     148
Sentiment Classification     IMDB (test)   Error Rate  17.42    144
Sentiment Classification     CR            Accuracy    83.8     142
Subjectivity Classification  Subj (test)   Accuracy    93.6     125
Question Classification      TREC (test)   Accuracy    92.2     124

(Showing 10 of 51 rows)
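The results above come from fitting linear models on frozen skip-thought vectors, with no fine-tuning of the encoder. A minimal sketch of that evaluation protocol, using a hand-rolled logistic regression on synthetic stand-ins for extracted sentence vectors (the paper's actual classifiers, features, and data differ):

```python
# Linear evaluation sketch: only a linear classifier is trained per task;
# the sentence encoder stays frozen. Data here is synthetic.
import numpy as np

rng = np.random.default_rng(1)

d = 4800                      # assumed sentence-vector size (combine-skip)
n = 200                       # number of labeled sentences

# Synthetic stand-ins for extracted sentence vectors and binary labels
# (think of a sentiment task such as MR or CR).
w_star = rng.standard_normal(d) / np.sqrt(d)
X = rng.standard_normal((n, d))
y = (X @ w_star > 0).astype(float)

# Plain logistic regression fit by gradient descent.
w, b = np.zeros(d), 0.0
lr = 0.5
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))   # predicted probabilities
    g = p - y                                # gradient of the log loss
    w -= lr * (X.T @ g) / n
    b -= lr * g.mean()

train_acc = ((X @ w + b > 0) == (y == 1)).mean()
```

Keeping the classifier linear is what makes the comparison meaningful: any accuracy achieved is attributable to the quality of the sentence representations rather than to task-specific feature learning.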
