
Rebalanced Siamese Contrastive Mining for Long-Tailed Recognition

About

Deep neural networks perform poorly on heavily class-imbalanced datasets. Given the promising performance of contrastive learning, we propose Rebalanced Siamese Contrastive Mining (ResCom) to tackle imbalanced recognition. Based on mathematical analysis and simulation results, we claim that supervised contrastive learning suffers from a dual class-imbalance problem at both the original batch and Siamese batch levels, which is more serious than in long-tailed classification learning. At the original batch level, we introduce a class-balanced supervised contrastive loss that assigns adaptive weights to different classes. At the Siamese batch level, we present a class-balanced queue, which maintains the same number of keys for all classes. Furthermore, we note that the gradient of the imbalanced contrastive loss with respect to the contrastive logits can be decoupled into positive and negative terms, and that easy positives and easy negatives make the contrastive gradient vanish. We therefore propose supervised hard positive and negative pair mining, which selects informative pairs for the contrastive computation and improves representation learning. Finally, to approximately maximize the mutual information between the two views, we propose Siamese Balanced Softmax and combine it with the contrastive loss for one-stage training. Extensive experiments demonstrate that ResCom outperforms previous methods by large margins on multiple long-tailed recognition benchmarks. Our code and models are publicly available at: https://github.com/dvlab-research/ResCom.
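One building block mentioned above, Balanced Softmax (which ResCom extends to its Siamese variant), can be sketched concisely: shift each class logit by the log of that class's sample count, so the softmax prior matches the long-tailed training distribution and tail classes are not drowned out. The following NumPy sketch is illustrative only; the function name, toy logits, and class counts are our own assumptions, not code from the paper.

```python
import numpy as np

def balanced_softmax_loss(logits, labels, class_counts):
    """Cross-entropy on logits shifted by the log class-frequency prior.

    Adding log(n_c) to class c's logit reweights the softmax by the
    empirical class distribution, compensating for long-tailed data.
    """
    log_prior = np.log(np.asarray(class_counts, dtype=np.float64))
    shifted = logits + log_prior                        # (batch, num_classes)
    shifted = shifted - shifted.max(axis=1, keepdims=True)  # numerical stability
    log_probs = shifted - np.log(np.exp(shifted).sum(axis=1, keepdims=True))
    return -log_probs[np.arange(len(labels)), labels].mean()

# Toy long-tailed setup: the head class has 100x the samples of the tail class.
logits = np.array([[2.0, 0.5, -1.0],
                   [0.1, 1.5,  0.3]])
labels = np.array([0, 2])
loss = balanced_softmax_loss(logits, labels, class_counts=[1000, 100, 10])
```

Note how the second sample, labeled with the rare class, dominates the loss: the log prior pushes probability mass toward head classes, so correctly predicting a tail class is penalized harder during training, which is exactly the rebalancing effect.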

Zhisheng Zhong, Jiequan Cui, Zeming Li, Eric Lo, Jian Sun, Jiaya Jia• 2022

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Image Classification | ImageNet-LT (test) | Top-1 Acc (All) | 59.2 | 159 |
| Image Classification | Places-LT (test) | Accuracy (Medium) | 43.4 | 128 |
| Image Classification | ImageNet-C (test) | mCE (Mean Corruption Error) | 69.4 | 110 |
| Image Classification | CIFAR-100-LT, IF 100 (test) | Top-1 Accuracy | 53.8 | 44 |
| Image Classification | CIFAR-100-LT, IF 50 (test) | Top-1 Accuracy | 58.0 | 42 |
| Image Classification | CIFAR-100-LT, IF 10 (test) | Top-1 Accuracy | 66.1 | 41 |
| Image Classification | iNaturalist 2018 (natural world distribution) | Acc (Total) | 0.752 | 39 |
| Image Classification | CIFAR-10-LT, IF 100 | Top-1 Accuracy | 84.9 | 36 |
| Image Classification | CIFAR-10-LT, IF 50 | Top-1 Accuracy | 88.0 | 35 |
| Image Classification | CIFAR-10-LT, IF 10 | Top-1 Accuracy | 92.0 | 33 |

(IF = imbalance factor.)
