Grow, Assess, Compress: Adaptive Backbone Scaling for Memory-Efficient Class Incremental Learning

About

Class Incremental Learning (CIL) poses a fundamental challenge: maintaining a balance between the plasticity required to learn new tasks and the stability needed to prevent catastrophic forgetting. While expansion-based methods effectively mitigate forgetting by adding task-specific parameters, they suffer from uncontrolled architectural growth and memory overhead. In this paper, we propose a novel dynamic scaling framework that adaptively manages model capacity through a cyclic "GRow, Assess, ComprEss" (GRACE) strategy. Crucially, we supplement backbone expansion with a saturation assessment phase that evaluates how much of the model's capacity is actually utilized. This assessment allows the framework to make an informed decision to either expand the architecture or compress the backbones into a streamlined representation, preventing parameter explosion. Experimental results demonstrate that our approach achieves state-of-the-art performance across multiple CIL benchmarks, while reducing memory footprint by up to 73% compared to purely expansion-based models.
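The abstract describes a per-task cycle of growing a backbone, assessing capacity utilization, and compressing the backbone pool when warranted. As a rough illustration only, the following is a minimal PyTorch-style sketch of such a loop; the GraceLearner class, the ReLU-activation saturation proxy, the parameter-averaging compress step, and the 0.8 threshold are all assumptions made for illustration and are not taken from the paper.

```python
# Hypothetical sketch of a GRACE-style grow/assess/compress cycle.
# The saturation metric, compression procedure, and decision rule used by the
# paper are not specified here; everything below is an illustrative placeholder.
import torch
import torch.nn as nn


class GraceLearner:
    def __init__(self, in_dim=32, feat_dim=16, saturation_threshold=0.8):
        self.backbones = nn.ModuleList()            # pool of task-specific feature extractors
        self.in_dim, self.feat_dim = in_dim, feat_dim
        self.saturation_threshold = saturation_threshold

    def grow(self):
        # GROW: attach a fresh backbone so the incoming task has dedicated capacity.
        self.backbones.append(
            nn.Sequential(nn.Linear(self.in_dim, self.feat_dim), nn.ReLU())
        )

    def assess(self, x):
        # ASSESS (toy proxy): average fraction of active ReLU units across the pool;
        # a low value suggests the added capacity is underused.
        with torch.no_grad():
            active = [(b(x) > 0).float().mean() for b in self.backbones]
            return torch.stack(active).mean().item()

    def compress(self):
        # COMPRESS (toy stand-in): fold all backbones into a single averaged one,
        # standing in for the paper's compression into a streamlined representation.
        merged = nn.Sequential(nn.Linear(self.in_dim, self.feat_dim), nn.ReLU())
        with torch.no_grad():
            for p_m, *ps in zip(merged.parameters(),
                                *(b.parameters() for b in self.backbones)):
                p_m.copy_(torch.stack([p.data for p in ps]).mean(dim=0))
        self.backbones = nn.ModuleList([merged])

    def new_task(self, x_probe):
        # Per-task cycle (decision rule assumed): grow for the new task, then
        # assess utilization and compress when the pool is underused, keeping
        # the number of backbones from exploding.
        self.grow()
        if len(self.backbones) > 1 and self.assess(x_probe) < self.saturation_threshold:
            self.compress()


if __name__ == "__main__":
    learner = GraceLearner()
    for task_id in range(5):
        probe = torch.randn(64, 32)   # stand-in for the incoming task's data
        learner.new_task(probe)
        print(f"task {task_id}: {len(learner.backbones)} backbone(s)")
```

The point of the sketch is the assessment step between growth and compression: expansion is no longer unconditional, which is how the framework keeps the parameter count bounded.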

Adrian Garcia-Castañeda, Jon Irureta, Jon Imaz, Aizea Lojo • 2026

Related benchmarks

Task | Dataset | Result | Rank
Class-incremental learning | CIFAR100 (test) | Avg Acc 71.44 | 116
Class-incremental learning | ImageNet-100 (Base 0 Inc 10) 1.0 (test) | Last Accuracy 66.76 | 10
Class-incremental learning | ImageNet-100 (Base 0 Inc 5) 1.0 (test) | Last Accuracy 64.86 | 9
Class-incremental learning | ImageNet-100 Base 50 Inc 10 1.0 (test) | Last Accuracy 73.5 | 9
Class-incremental learning | CIFAR-100 AutoAugment Base 50 Inc 10 | Last Accuracy 69.42 | 4
Class-incremental learning | CIFAR-100 AutoAugment (Base 0 Inc 10) | Last Accuracy 67.98 | 4
