
Multi-Similarity Contrastive Learning

About

Given a similarity metric, contrastive methods learn a representation in which similar examples are pushed together and dissimilar examples are pulled apart. Contrastive learning techniques have been used extensively to learn representations for tasks ranging from image classification to caption generation. However, existing contrastive learning approaches can fail to generalize because they do not take into account the possibility of different similarity relations. In this paper, we propose a novel multi-similarity contrastive loss (MSCon) that learns generalizable embeddings by jointly utilizing supervision from multiple metrics of similarity. Our method automatically learns contrastive similarity weightings based on the uncertainty in the corresponding similarity, down-weighting uncertain tasks and leading to better out-of-domain generalization to new tasks. We show empirically that networks trained with MSCon outperform state-of-the-art baselines in both in-domain and out-of-domain settings.
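The abstract describes combining per-similarity contrastive losses with learned, uncertainty-based task weights. Below is a minimal NumPy sketch of that idea, assuming a supervised contrastive loss per similarity metric and a Kendall-style homoscedastic-uncertainty weighting (exp(-s)·L + s); the function names and the exact weighting form are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def supcon_loss(embeddings, labels, temperature=0.1):
    """Supervised contrastive loss for one similarity metric:
    examples sharing a label are positives, all others negatives.
    (Illustrative sketch; the paper's exact loss may differ.)"""
    z = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    sim = z @ z.T / temperature  # pairwise cosine similarities, scaled
    n = len(labels)
    loss = 0.0
    for i in range(n):
        positives = [j for j in range(n) if j != i and labels[j] == labels[i]]
        if not positives:
            continue
        others = [j for j in range(n) if j != i]
        log_denom = np.log(np.sum(np.exp(sim[i, others])))
        # mean negative log-probability of picking a positive for anchor i
        loss += -np.mean([sim[i, j] - log_denom for j in positives])
    return loss / n

def mscon_loss(embeddings_per_task, labels_per_task, log_vars):
    """Combine per-similarity contrastive losses with learned uncertainty
    weights: a task with a large log-variance s is down-weighted by
    exp(-s), and the +s term keeps the uncertainty from growing unboundedly.
    (Assumed weighting scheme, hedged against the abstract's description.)"""
    total = 0.0
    for z, y, s in zip(embeddings_per_task, labels_per_task, log_vars):
        total += np.exp(-s) * supcon_loss(z, y) + s
    return total
```

In practice the `log_vars` would be trainable parameters optimized jointly with the encoder, so tasks whose similarity supervision is noisy are automatically down-weighted.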

Emily Mu, John Guttag, Maggie Makar • 2023

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
| --- | --- | --- | --- | --- |
| Image Classification | MEDIC (in-domain) | Top-1 Accuracy (Damage Severity) | 81 | 12 |
| Category Classification | Zappos50k in-domain (test) | Top-1 Accuracy | 97.17 | 10 |
| Closure Classification | Zappos50k in-domain (test) | Top-1 Accuracy | 94.37 | 10 |
| Gender Classification | Zappos50k in-domain (test) | Top-1 Accuracy | 85.98 | 10 |
| Brand Classification | Zappos50k OOD evaluation | Top-1 Accuracy | 42.62 | 2 |
| Classification | MEDIC OOD evaluation | Top-1 Accuracy (Damage Severity) | 80.98 | 2 |
