
Dynamical Adapter Fusion: Constructing A Global Adapter for Pre-Trained Model-based Class-Incremental Learning

About

Class-Incremental Learning (CIL) requires models to continuously acquire new classes without forgetting previously learned ones. A dominant paradigm freezes a pre-trained model and trains lightweight, task-specific adapters. However, maintaining task-specific parameters hinders knowledge transfer and incurs high retrieval costs, while naive parameter fusion often causes destructive interference and catastrophic forgetting. To address these challenges, we propose Dynamical Adapter Fusion (DAF), which constructs a single robust global adapter. Grounded in the PAC-Bayes theorem, we derive a fusion mechanism that explicitly integrates three components: the optimized task-specific adapter parameters, the previous global adapter parameters, and the initialization parameters. We use a Taylor expansion of the loss function to derive optimal fusion coefficients, dynamically balancing stability and plasticity. We further propose a Robust Initialization strategy to effectively capture global knowledge patterns. Experiments on multiple CIL benchmarks show that DAF achieves state-of-the-art (SOTA) performance.
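The fusion mechanism described above combines three parameter sets into one global adapter. The sketch below is a minimal illustration of such a three-way weighted fusion, not the paper's exact rule: the function name `fuse_global_adapter` and all variable names are hypothetical, and the coefficients `alpha` and `beta` are fixed scalars here, whereas DAF derives them dynamically from a Taylor expansion of the loss.

```python
import torch

def fuse_global_adapter(theta_task, theta_global, theta_init, alpha, beta):
    """Illustrative three-way adapter fusion (hypothetical form).

    theta_task:   adapter parameters optimized on the current task
    theta_global: previous global adapter parameters
    theta_init:   initialization parameters
    alpha, beta:  fusion coefficients; DAF derives these dynamically,
                  here they are fixed scalars for clarity
    """
    # Assume a convex combination, so the third coefficient is implied.
    gamma = 1.0 - alpha - beta
    return {
        name: alpha * theta_task[name]
              + beta * theta_global[name]
              + gamma * theta_init[name]
        for name in theta_task
    }

# Toy usage with a single adapter weight matrix per parameter dict.
theta_init = {"down_proj": torch.zeros(4, 2)}
theta_global = {"down_proj": 0.5 * torch.ones(4, 2)}
theta_task = {"down_proj": torch.ones(4, 2)}
fused = fuse_global_adapter(theta_task, theta_global, theta_init, alpha=0.6, beta=0.3)
print(fused["down_proj"])
```

Weighting toward the task-specific parameters favors plasticity, while weighting toward the previous global parameters favors stability; the paper's contribution is choosing this trade-off dynamically rather than with fixed coefficients as in this sketch.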

Ruiqi Liu, Boyu Diao, Zijia An, Zhulin An, Fei Wang, Yongjun Xu • 2026

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Class-incremental learning | CIFAR-100 | Averaged Incremental Accuracy | 91.15 | 234 |
| Class-incremental learning | ImageNet-R | Average Accuracy | 84.01 | 103 |
| Class-incremental learning | ImageNet-A | Average Accuracy | 72.06 | 86 |
| Class-incremental learning | ObjectNet | Average Accuracy | 76.11 | 40 |
