
Prototypical Contrastive Learning of Unsupervised Representations

About

This paper presents Prototypical Contrastive Learning (PCL), an unsupervised representation learning method that addresses the fundamental limitations of instance-wise contrastive learning. PCL not only learns low-level features for the task of instance discrimination, but more importantly, it implicitly encodes semantic structures of the data into the learned embedding space. Specifically, we introduce prototypes as latent variables to help find the maximum-likelihood estimation of the network parameters in an Expectation-Maximization framework. We iteratively perform E-step as finding the distribution of prototypes via clustering and M-step as optimizing the network via contrastive learning. We propose ProtoNCE loss, a generalized version of the InfoNCE loss for contrastive learning, which encourages representations to be closer to their assigned prototypes. PCL outperforms state-of-the-art instance-wise contrastive learning methods on multiple benchmarks with substantial improvement in low-resource transfer learning. Code and pretrained models are available at https://github.com/salesforce/PCL.
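The EM loop described above can be sketched in a few lines: an E-step that clusters the (L2-normalized) embeddings to obtain prototypes and assignments, and the prototype term of the ProtoNCE objective, which scores each embedding against all prototypes and penalizes low probability on its assigned one. This is a minimal NumPy sketch, not the authors' implementation: the helper names are illustrative, a single fixed concentration `phi` stands in for PCL's per-cluster concentration estimate, and the instance-wise InfoNCE term and the averaging over multiple clusterings are omitted.

```python
import numpy as np

def l2_normalize(x, axis=-1):
    """Project vectors onto the unit hypersphere, as PCL does with embeddings."""
    return x / np.linalg.norm(x, axis=axis, keepdims=True)

def e_step_kmeans(features, k, iters=10, seed=0):
    """E-step sketch: spherical k-means over embeddings.

    Returns unit-norm prototypes (cluster centroids) and the cluster
    assignment of each embedding.
    """
    rng = np.random.default_rng(seed)
    prototypes = features[rng.choice(len(features), size=k, replace=False)]
    for _ in range(iters):
        # Assign each embedding to its most similar prototype (cosine similarity).
        assign = np.argmax(features @ prototypes.T, axis=1)
        for j in range(k):
            members = features[assign == j]
            if len(members):
                prototypes[j] = l2_normalize(members.mean(axis=0))
    return prototypes, assign

def proto_nce_term(v, prototypes, assign, phi=0.1):
    """Prototype term of ProtoNCE (M-step objective, sketch).

    Cross-entropy of a softmax over prototype similarities scaled by a
    concentration phi; the target is each embedding's assigned prototype.
    """
    logits = (v @ prototypes.T) / phi
    logits -= logits.max(axis=1, keepdims=True)          # numerical stability
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -log_prob[np.arange(len(v)), assign].mean()

# Usage: cluster random unit embeddings, then evaluate the prototype loss.
feats = l2_normalize(np.random.default_rng(1).normal(size=(64, 8)))
protos, assign = e_step_kmeans(feats, k=4)
loss = proto_nce_term(feats, protos, assign)
```

In the full method this loss would be minimized with respect to the network producing `v` (here the embeddings are fixed random vectors), and the E-step would be re-run on the updated embeddings each epoch.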

Junnan Li, Pan Zhou, Caiming Xiong, Steven C.H. Hoi • 2020

Related benchmarks

| Task | Dataset | Result | Rank |
|---|---|---|---|
| Object Detection | COCO 2017 (val) | -- | 2454 |
| Semantic Segmentation | PASCAL VOC 2012 (val) | Mean IoU: 69.6 | 2040 |
| Image Classification | ImageNet-1k (val) | Top-1 Accuracy: 67.6 | 1453 |
| Image Classification | ImageNet (val) | Top-1 Accuracy: 65.9 | 1206 |
| Classification | ImageNet-1K 1.0 (val) | Top-1 Accuracy (%): 67.6 | 1155 |
| Instance Segmentation | COCO 2017 (val) | -- | 1144 |
| Image Classification | ImageNet-1k (val) | Top-1 Accuracy: 61.5 | 840 |
| Image Classification | ImageNet-1k (val) | Top-1 Accuracy: 67.6 | 706 |
| Object Detection | COCO (val) | mAP: 37.8 | 613 |
| Image Classification | ImageNet-1K | Top-1 Accuracy: 67.6 | 524 |

Showing 10 of 99 rows

Other info

Code

https://github.com/salesforce/PCL