
ART: Adaptive Resampling-based Training for Imbalanced Classification

About

Traditional resampling methods for handling class imbalance typically use fixed distributions, undersampling the majority class or oversampling the minority class. These static strategies ignore changes in class-wise learning difficulty, which can limit overall model performance. This paper proposes an Adaptive Resampling-based Training (ART) method that periodically updates the distribution of the training data based on the class-wise performance of the model. Specifically, ART uses class-wise macro F1 scores, computed at fixed intervals, to determine the degree of resampling to be performed. Unlike instance-level difficulty modeling, which is noisy and outlier-sensitive, ART adapts at the class level. This allows the model to incrementally shift its attention towards underperforming classes in a way that better aligns with the optimization objective. Results on diverse benchmarks, including the Pima Indians Diabetes and Yeast datasets, demonstrate that ART consistently outperforms both resampling-based and algorithm-level methods, including the Synthetic Minority Oversampling Technique (SMOTE), NearMiss undersampling, and cost-sensitive learning, on binary as well as multi-class classification tasks with varying degrees of imbalance. In most settings, these improvements are statistically significant: on tabular datasets, gains are significant under paired t-tests and Wilcoxon tests (p < 0.05), while results on text and image tasks remain favorable. Compared to training on the original imbalanced data, ART improves macro F1 by an average of 2.64 percentage points across all tested tabular datasets. Unlike existing methods, whose performance varies by task, ART consistently delivers the strongest macro F1, making it a reliable choice for imbalanced classification.
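The abstract does not give the exact resampling rule, but the core loop can be sketched: at fixed intervals, compute per-class F1 on a validation set and resample the training data so that lower-scoring classes receive more mass. The weighting below (sampling weight proportional to 1 − F1 per class) is a hypothetical illustration, not the paper's formula; `class_f1` and `adaptive_resample` are names chosen for this sketch.

```python
import numpy as np

def class_f1(y_true, y_pred, n_classes):
    """Per-class (one-vs-rest) F1 scores."""
    scores = np.zeros(n_classes)
    for c in range(n_classes):
        tp = np.sum((y_pred == c) & (y_true == c))
        fp = np.sum((y_pred == c) & (y_true != c))
        fn = np.sum((y_pred != c) & (y_true == c))
        denom = 2 * tp + fp + fn
        scores[c] = 2 * tp / denom if denom > 0 else 0.0
    return scores

def adaptive_resample(X, y, f1_scores, seed=None):
    """Draw a bootstrap sample of the training set whose class
    proportions are skewed towards low-F1 (underperforming) classes.
    Weighting scheme (assumed, not from the paper): class weight
    proportional to (1 - F1), spread evenly over that class's samples."""
    rng = np.random.default_rng(seed)
    n = len(y)
    class_w = 1.0 - f1_scores            # harder classes get more mass
    class_w = class_w / class_w.sum()
    counts = np.bincount(y, minlength=len(f1_scores))
    p = class_w[y] / counts[y]           # per-sample probability
    p = p / p.sum()
    idx = rng.choice(n, size=n, replace=True, p=p)
    return X[idx], y[idx]
```

In a training loop, one would call `adaptive_resample` every k epochs with the latest validation predictions, then continue training on the resampled data; the class-level (rather than instance-level) granularity is what keeps the signal robust to outliers.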

Arjun Basandrai, Shourya Jain, K. Ilanthenral • 2025

Related benchmarks

Task                        Dataset          Result            Rank
Classification              pima             Accuracy 77.58    17
Classification              pima             Recall 77.03      12
Classification              Red-Wine         Recall 37.83      12
Imbalanced Classification   Pima (test)      Macro F1 76.31    12
Imbalanced Classification   Yeast (test)     Macro F1 53.06    12
Imbalanced Classification   Red Wine (test)  Macro F1 35.06    12
Classification              Yeast            Recall 54.58      12
Classification              Yeast            Accuracy 59.14    11
Classification              pima             Precision 76.73   11
Classification              Yeast            Precision 56.44   11

Showing 10 of 20 rows
