
An efficient framework for learning sentence representations

About

In this work we propose a simple and efficient framework for learning sentence representations from unlabelled data. Drawing inspiration from the distributional hypothesis and recent work on learning sentence representations, we reformulate the problem of predicting the context in which a sentence appears as a classification problem. Given a sentence and its context, a classifier distinguishes context sentences from other contrastive sentences based on their vector representations. This allows us to efficiently learn different types of encoding functions, and we show that the model learns high-quality sentence representations. We demonstrate that our sentence representations outperform state-of-the-art unsupervised and supervised representation learning methods on several downstream NLP tasks that involve understanding sentence semantics while achieving an order of magnitude speedup in training time.
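The classification objective described above admits a compact implementation. The following is a minimal sketch, not the authors' reference code: a sentence encoder and a candidate (context) encoder score pairs by an inner product, and a softmax classifier must pick the true context sentence out of the other sentences in the batch, which act as the contrastive sentences. The GRU architecture, the dimensions, and the use of in-batch negatives here are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SentenceEncoder(nn.Module):
    """Toy encoder: embedding lookup + GRU; the final hidden state is
    the sentence vector. Any encoder that maps a sentence to a
    fixed-size vector fits the framework."""
    def __init__(self, vocab_size, embed_dim=300, hidden_dim=1200):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.rnn = nn.GRU(embed_dim, hidden_dim, batch_first=True)

    def forward(self, token_ids):          # token_ids: (batch, seq_len)
        _, h = self.rnn(self.embed(token_ids))
        return h.squeeze(0)                # (batch, hidden_dim)

def context_classification_loss(f, g, sentences, contexts):
    """Score each (sentence, candidate) pair with an inner product and
    classify the true context sentence among the other candidates in
    the batch (the contrastive sentences)."""
    u = f(sentences)                       # input-sentence vectors
    v = g(contexts)                        # candidate-sentence vectors
    scores = u @ v.t()                     # (batch, batch) pairwise scores
    targets = torch.arange(u.size(0), device=u.device)  # true context = diagonal
    return F.cross_entropy(scores, targets)

# Usage (vocabulary size and shapes are arbitrary):
vocab_size = 20000
f, g = SentenceEncoder(vocab_size), SentenceEncoder(vocab_size)
sents = torch.randint(0, vocab_size, (32, 30))  # batch of token ids
ctxs = torch.randint(0, vocab_size, (32, 30))   # their context sentences
loss = context_classification_loss(f, g, sents, ctxs)
loss.backward()
```

Using separate encoders for the input sentence and the candidates follows the paper's setup; at evaluation time the paper concatenates the outputs of the two encoders to form the sentence representation.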

Lajanugen Logeswaran, Honglak Lee • 2018

Related benchmarks

Task                                 Dataset           Result               Rank
Subjectivity Classification          Subj              Accuracy: 94.8       266
Text Classification                  TREC              Accuracy: 92.4       179
Sentiment Classification            CR                Accuracy: 86         142
Text Classification                  MR                Accuracy: 82.4       93
Sentence Embedding Evaluation        SentEval          --                   44
Text Classification                  SST binary        Accuracy: 87.6       29
Sentence Representation Evaluation   SentEval (test)   MR Accuracy: 81.3    28
Text Classification                  MPQA              Accuracy: 90.2       25
Linguistic Probing                   SentEval          BShift: 56.8         9
