
Influence-Balanced Loss for Imbalanced Visual Classification

About

In this paper, we propose a balancing training method to address problems in imbalanced data learning. To this end, we derive a new loss, used in the balancing training phase, that alleviates the influence of samples that cause an overfitted decision boundary. The proposed loss efficiently improves the performance of any type of imbalanced-learning method. In experiments on multiple benchmark datasets, we demonstrate the validity of our method and show that the proposed loss outperforms state-of-the-art cost-sensitive loss methods. Furthermore, since our loss is not restricted to a specific task, model, or training method, it can easily be combined with other recent re-sampling, meta-learning, and cost-sensitive learning methods for class-imbalance problems.
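The core idea above — down-weighting samples whose influence pushes the decision boundary toward overfitting — can be sketched as a re-weighted cross-entropy. The sketch below is an illustration under stated assumptions, not the authors' implementation: it approximates each sample's influence by the product of its prediction error ‖softmax(z) − y‖₁ and its feature norm ‖h‖₂, then divides the per-sample cross-entropy by that quantity, so high-influence samples contribute less. The function names and the `alpha`/`eps` hyperparameters are hypothetical.

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax over the class axis.
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def ib_weighted_ce(logits, features, labels, alpha=1.0, eps=1e-3):
    """Influence-balanced weighting sketch (hypothetical helper).

    logits:   (n, k) classifier outputs z
    features: (n, d) penultimate-layer features h
    labels:   (n,)   integer class labels

    Each sample's cross-entropy is divided by an influence proxy
    ||softmax(z) - y||_1 * ||h||_2, so samples that most distort the
    decision boundary are down-weighted; eps avoids division by zero.
    """
    probs = softmax(logits)
    n, k = probs.shape
    onehot = np.eye(k)[labels]
    ce = -np.log(probs[np.arange(n), labels] + 1e-12)
    influence = np.abs(probs - onehot).sum(axis=1) * np.linalg.norm(features, axis=1)
    return alpha * ce / (influence + eps)
```

In practice such a weighting would be applied only in a second, balancing phase of training, after a conventionally trained model has converged; class-wise prior weights (e.g. inverse class frequency) could additionally multiply `alpha`.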

Seulki Park, Jongin Lim, Younghan Jeon, Jin Young Choi • 2021

Related benchmarks

| Task | Dataset | Result | Rank |
|---|---|---|---|
| Image Classification | iNaturalist 2018 (test) | -- | 192 |
| Image Classification | CIFAR-10-LT (test) | Top-1 Error: 0.1432 | 185 |
| Image Classification | iNaturalist 2018 (val) | Top-1 Accuracy: 65.39 | 116 |
| Long-tailed Visual Recognition | ImageNet LT | Overall Accuracy: 56.2 | 89 |
| Image Classification | CIFAR-100-LT (Imbalance Ratio 100) | Top-1 Acc: 0.45 | 88 |
| Image Classification | CIFAR-100-LT (Imbalance Ratio 10) | Top-1 Acc: 58 | 83 |
| Image Classification | CIFAR-LT-100 (test) | Top-1 Error: 45.47 | 74 |
| Image Classification | CIFAR-100-LT Imbalance Ratio 50 (test) | Accuracy: 47.4 | 62 |
| Image Classification | CIFAR-100-LT Imbalance Ratio 100 (test) | Accuracy: 42 | 62 |
| Image Classification | CIFAR-100-LT (Imbalance Ratio 50) | Top-1 Accuracy: 48.9 | 61 |

Showing 10 of 22 rows.

Other info

Code
