
Long-tail learning via logit adjustment

About

Real-world classification problems typically exhibit an imbalanced or long-tailed label distribution, wherein many labels are associated with only a few samples. This poses a challenge for generalisation on such labels, and also makes naïve learning biased towards dominant labels. In this paper, we present two simple modifications of standard softmax cross-entropy training to cope with these challenges. Our techniques revisit the classic idea of logit adjustment based on the label frequencies, either applied post-hoc to a trained model, or enforced in the loss during training. Such adjustment encourages a large relative margin between logits of rare versus dominant labels. These techniques unify and generalise several recent proposals in the literature, while possessing firmer statistical grounding and empirical performance.
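The two variants described above can be sketched in a few lines. This is a minimal illustration, not the authors' reference implementation: the function names, the `tau` scaling parameter, and the use of empirical label frequencies as `class_priors` are assumptions for the sake of the example.

```python
import numpy as np

def posthoc_adjust(logits, class_priors, tau=1.0):
    """Post-hoc variant: subtract tau * log(prior) from each trained logit,
    so prediction becomes argmax_y f_y(x) - tau * log(pi_y)."""
    return logits - tau * np.log(class_priors)

def logit_adjusted_loss(logits, label, class_priors, tau=1.0):
    """Training-time variant: softmax cross-entropy computed on logits
    shifted by +tau * log(prior), which enforces a larger relative margin
    for rare labels during learning."""
    adjusted = logits + tau * np.log(class_priors)
    adjusted = adjusted - adjusted.max()  # numerical stability
    log_probs = adjusted - np.log(np.sum(np.exp(adjusted)))  # log-softmax
    return -log_probs[label]
```

With priors `[0.9, 0.1]` and near-tied logits `[2.0, 1.9]`, the post-hoc adjustment shifts the decision towards the rare second class, which is exactly the rare-versus-dominant margin effect the abstract describes.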

Aditya Krishna Menon, Sadeep Jayasumana, Ankit Singh Rawat, Himanshu Jain, Andreas Veit, Sanjiv Kumar • 2020

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
| --- | --- | --- | --- | --- |
| Image Classification | iNaturalist 2018 | Top-1 Accuracy | 68.4 | 287 |
| Image Classification | ImageNet LT | Top-1 Accuracy | 56.5 | 251 |
| Long-Tailed Image Classification | ImageNet-LT (test) | -- | -- | 220 |
| Image Classification | CIFAR-10 long-tailed (test) | Top-1 Acc | 75.3 | 201 |
| Image Classification | iNaturalist 2018 (test) | Top-1 Accuracy | 66.4 | 192 |
| Text Classification | SST-2 (test) | Accuracy | 86.61 | 185 |
| Image Classification | ImageNet-LT (test) | Top-1 Acc (All) | 51.1 | 159 |
| Image Classification | CIFAR100 long-tailed (test) | Accuracy | 58.6 | 155 |
| Classification | CIFAR100-LT (test) | Accuracy | 62.4 | 136 |
| Long-tailed Visual Recognition | ImageNet LT | Overall Accuracy | 52.1 | 89 |

Showing 10 of 112 rows.
