# Influence-Balanced Loss for Imbalanced Visual Classification

## About
In this paper, we propose a balancing training method to address problems in imbalanced data learning. To this end, we derive a new loss, used in the balancing training phase, that alleviates the influence of samples that cause an overfitted decision boundary. The proposed loss efficiently improves the performance of any type of imbalanced-learning method. In experiments on multiple benchmark datasets, we demonstrate the validity of our method and show that the proposed loss outperforms state-of-the-art cost-sensitive loss methods. Furthermore, since our loss is not restricted to a specific task, model, or training method, it can easily be combined with other recent re-sampling, meta-learning, and cost-sensitive learning methods for class-imbalance problems.
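To make the idea concrete, below is a minimal NumPy sketch of one way such an influence-balanced weighting could look. It is an illustration, not the authors' implementation: the function name and the influence proxy (`||f(x) - y||_1 * ||h||_1`, i.e. the output error times the feature magnitude) are assumptions for this sketch, and high-influence samples, those most likely to shape an overfitted boundary, are down-weighted.

```python
import numpy as np

def influence_balanced_loss(probs, labels, features, class_weights=None):
    """Illustrative influence-balanced cross-entropy (not the official code).

    probs:    (N, C) softmax outputs f(x)
    labels:   (N,)   integer class labels
    features: (N, D) hidden features h feeding the classifier head
    """
    n, c = probs.shape
    onehot = np.eye(c)[labels]
    # Standard cross-entropy per sample.
    ce = -np.log(probs[np.arange(n), labels] + 1e-12)
    # Influence proxy: ||f(x) - y||_1 * ||h||_1 (an assumed surrogate
    # for how strongly the sample pulls on the decision boundary).
    influence = np.abs(probs - onehot).sum(axis=1) * np.abs(features).sum(axis=1)
    # Down-weight high-influence samples; epsilon avoids division by zero.
    weights = 1.0 / (influence + 1e-3)
    if class_weights is not None:
        # Optional per-class re-weighting for the imbalanced label distribution.
        weights = weights * class_weights[labels]
    return np.mean(weights * ce)
```

Because the weighting only rescales a per-sample loss, it can be dropped into an existing training loop alongside re-sampling or class re-weighting schemes.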
## Related benchmarks
| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Image Classification | iNaturalist 2018 (test) | -- | -- | 192 |
| Image Classification | CIFAR-10-LT (test) | Top-1 Error | 0.1432 | 185 |
| Image Classification | iNaturalist 2018 (val) | Top-1 Accuracy | 65.39 | 116 |
| Long-tailed Visual Recognition | ImageNet-LT | Overall Accuracy | 56.2 | 89 |
| Image Classification | CIFAR-100-LT (Imbalance Ratio 100) | Top-1 Accuracy | 0.45 | 88 |
| Image Classification | CIFAR-100-LT (Imbalance Ratio 10) | Top-1 Accuracy | 58 | 83 |
| Image Classification | CIFAR-100-LT (test) | Top-1 Error | 45.47 | 74 |
| Image Classification | CIFAR-100-LT (Imbalance Ratio 50, test) | Accuracy | 47.4 | 62 |
| Image Classification | CIFAR-100-LT (Imbalance Ratio 100, test) | Accuracy | 42 | 62 |
| Image Classification | CIFAR-100-LT (Imbalance Ratio 50) | Top-1 Accuracy | 48.9 | 61 |