SCD: Self-Contrastive Decorrelation for Sentence Embeddings

About

In this paper, we propose Self-Contrastive Decorrelation (SCD), a self-supervised approach to learning sentence embeddings. Given an input sentence, SCD optimizes a joint self-contrastive and decorrelation objective, and representation learning is driven by the contrast between instantiations of standard dropout applied at different rates. The proposed method is conceptually simple yet empirically powerful: it achieves results comparable to state-of-the-art methods on multiple benchmarks without using contrastive pairs. This study opens up avenues for efficient self-supervised learning methods that are more robust than current contrastive methods.
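The joint objective described above can be sketched in a few lines. The snippet below is a minimal NumPy illustration, not the authors' implementation: the toy encoder outputs, the dropout rates, the exact form of each loss term, and the weights `alpha`, `beta`, and `lam` are all assumptions. It pairs a self-contrastive term (repelling the two dropout views of each sentence) with a decorrelation term that pushes the cross-correlation matrix of the two views toward the identity.

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout(x, rate, rng):
    # Inverted dropout: zero each unit with probability `rate`, rescale survivors.
    mask = rng.random(x.shape) >= rate
    return x * mask / (1.0 - rate)

def scd_loss(z1, z2, alpha=1.0, beta=0.005, lam=1.0, eps=1e-8):
    # Self-contrastive term: minimize cosine similarity between paired views
    # of the same sentence (the two dropout instantiations repel each other).
    cos = np.sum(z1 * z2, axis=1) / (
        np.linalg.norm(z1, axis=1) * np.linalg.norm(z2, axis=1) + eps)
    contrastive = cos.mean()

    # Decorrelation term: standardize each feature across the batch, then
    # drive the cross-correlation matrix toward the identity (align matching
    # dimensions, decorrelate the rest).
    a = (z1 - z1.mean(0)) / (z1.std(0) + eps)
    b = (z2 - z2.mean(0)) / (z2.std(0) + eps)
    c = a.T @ b / z1.shape[0]
    on_diag = ((np.diag(c) - 1.0) ** 2).sum()
    off_diag = (c ** 2).sum() - (np.diag(c) ** 2).sum()
    decorrelation = on_diag + lam * off_diag

    # Hypothetical weighting of the two terms.
    return alpha * contrastive + beta * decorrelation

# Toy batch: 8 pre-dropout "encoder outputs" of dimension 16; the two views
# come from dropout instantiated at two different rates.
h = rng.normal(size=(8, 16))
z1 = dropout(h, rate=0.10, rng=rng)  # view 1
z2 = dropout(h, rate=0.15, rng=rng)  # view 2
loss = scd_loss(z1, z2)
print(float(loss))
```

Because both views come from the same sentence, the decorrelation term supplies the "attract" signal at the feature level while the self-contrastive term repels at the instance level, which is how the method avoids needing explicit contrastive pairs.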

Tassilo Klein, Moin Nabi • 2022

Related benchmarks

| Task | Dataset | Result | Rank |
| --- | --- | --- | --- |
| Semantic Textual Similarity | STS tasks (STS12, STS13, STS14, STS15, STS16, STS-B, SICK-R), various (test) | STS12 Score 66.94 | 393 |
| Subjectivity Classification | Subj | Accuracy 99.56 | 266 |
| Question Classification | TREC | Accuracy 89.8 | 205 |
| Opinion Polarity Detection | MPQA | Accuracy 88.67 | 154 |
| Sentiment Classification | MR | Accuracy 82.17 | 148 |
| Sentiment Classification | CR | Accuracy 87.76 | 142 |
| Sentiment Classification | SST | Accuracy 88.19 | 24 |
| Paraphrase Detection | Microsoft Paraphrase Corpus | Accuracy 75.71 | 21 |
