
Siamese CBOW: Optimizing Word Embeddings for Sentence Representations

About

We present the Siamese Continuous Bag of Words (Siamese CBOW) model, a neural network for efficient estimation of high-quality sentence embeddings. Averaging the embeddings of words in a sentence has proven to be a surprisingly successful and efficient way of obtaining sentence embeddings. However, word embeddings trained with the methods currently available are not optimized for the task of sentence representation, and, thus, likely to be suboptimal. Siamese CBOW handles this problem by training word embeddings directly for the purpose of being averaged. The underlying neural network learns word embeddings by predicting, from a sentence representation, its surrounding sentences. We show the robustness of the Siamese CBOW model by evaluating it on 20 datasets stemming from a wide variety of sources.
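To make the idea concrete, here is a minimal sketch of the Siamese CBOW forward pass: each sentence is represented as the average of its word embeddings, and a softmax over cosine similarities scores candidate sentences (adjacent sentences as positives, sampled sentences as negatives) against a query sentence. The vocabulary, dimensionality, and function names below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy vocabulary and randomly initialized word embeddings
# (the trainable parameters; assumed sizes for illustration).
vocab = {"the": 0, "cat": 1, "sat": 2, "dog": 3, "ran": 4, "home": 5}
dim = 8
W = rng.normal(scale=0.1, size=(len(vocab), dim))

def sentence_embedding(tokens, W):
    """Siamese CBOW sentence representation: the average of its word embeddings."""
    idx = [vocab[t] for t in tokens]
    return W[idx].mean(axis=0)

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def siamese_cbow_probs(query, positives, negatives, W):
    """Softmax over cosine similarities between the query sentence and
    candidate sentences (positives = surrounding sentences, negatives = sampled)."""
    q = sentence_embedding(query, W)
    sims = np.array(
        [cosine(q, sentence_embedding(s, W)) for s in positives + negatives]
    )
    e = np.exp(sims - sims.max())  # numerically stable softmax
    return e / e.sum()

# Training would push probability mass onto the positive (surrounding) sentences
# via categorical cross-entropy, updating W; here we only compute the forward pass.
p = siamese_cbow_probs(
    query=["the", "cat", "sat"],
    positives=[["the", "dog", "ran"]],
    negatives=[["home"], ["ran", "home"]],
    W=W,
)
```

Because only an averaging step and a cosine softmax sit on top of the word embeddings, gradients flow directly into the embeddings themselves, which is what optimizes them "for the purpose of being averaged."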

Tom Kenter, Alexey Borisov, Maarten de Rijke • 2016

Related benchmarks

Task                    Dataset          Result          Rank
Sentence Relatedness    STS 2014 News    Spearman 0.58   30
