
DiffCSE: Difference-based Contrastive Learning for Sentence Embeddings

About

We propose DiffCSE, an unsupervised contrastive learning framework for learning sentence embeddings. DiffCSE learns sentence embeddings that are sensitive to the difference between the original sentence and an edited sentence, where the edited sentence is obtained by stochastically masking out the original sentence and then sampling from a masked language model. We show that DiffCSE is an instance of equivariant contrastive learning (Dangovski et al., 2021), which generalizes contrastive learning and learns representations that are insensitive to certain types of augmentations and sensitive to other "harmful" types of augmentations. Our experiments show that DiffCSE achieves state-of-the-art results among unsupervised sentence representation learning methods, outperforming unsupervised SimCSE by 2.3 absolute points on semantic textual similarity tasks.
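The editing step described above — stochastically mask tokens of the original sentence, then let a masked language model fill in the blanks to produce the "edited" sentence — can be sketched as follows. This is an illustrative sketch, not the authors' code: the mask token name and masking rate are assumptions (BERT-style defaults), and the MLM generator that would fill the masks is elided.

```python
import random

MASK = "[MASK]"  # placeholder mask token, as in BERT-style MLMs (assumption)

def stochastically_mask(tokens, mask_prob=0.15, rng=None):
    """Randomly replace a fraction of tokens with the mask token.

    In DiffCSE, the masked sequence is passed to a masked language
    model; its fills yield the edited sentence that the encoder is
    trained to distinguish from the original.
    """
    rng = rng or random.Random()
    return [MASK if rng.random() < mask_prob else tok for tok in tokens]

tokens = "the quick brown fox jumps over the lazy dog".split()
masked = stochastically_mask(tokens, mask_prob=0.3, rng=random.Random(0))
print(masked)  # some tokens replaced by [MASK]; an MLM would then fill them
```

A real pipeline would tokenize with the MLM's own tokenizer and sample fills from its output distribution; the point here is only that the edit is a stochastic, model-generated rewrite rather than a handcrafted augmentation.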

Yung-Sung Chuang, Rumen Dangovski, Hongyin Luo, Yang Zhang, Shiyu Chang, Marin Soljačić, Shang-Wen Li, Wen-tau Yih, Yoon Kim, James Glass • 2022

Related benchmarks

Task | Dataset | Result | Rank
Semantic Textual Similarity | STS tasks (STS12, STS13, STS14, STS15, STS16, STS-B, SICK-R), various (test) | STS12 Score: 72.28 | 393
Semantic Textual Similarity | STS tasks (STS12, STS13, STS14, STS15, STS16, STS-B, SICK-R) | STS12 Score: 72.28 | 195
Sentence Classification Transfer Tasks | SentEval transfer tasks | Average Accuracy: 0.8704 | 99
Retrieval | MS MARCO (dev) | MRR@10: 0.3202 | 84
Semantic Textual Similarity | STS (Semantic Textual Similarity) 2012-2016 (test) | STS-12 Score: 72.28 | 57
Sentence Embedding Evaluation | SentEval | Average Score (Avg): 87.04 | 44
Dense Retrieval | BEIR zero-shot | TREC-COVID: 49.2 | 13
Semantic Textual Similarity | MS-COCO CxC annotations 5k (test) | Spearman's R (Avg): 0.701 | 11
Information Retrieval | Natural Questions | Recall@10: 78.53 | 9
Dense Retrieval | Natural Questions (test) | Recall@10: 73.93 | 9

Showing 10 of 18 rows

Other info

Code
