
Is Parameter Isolation Better for Prompt-Based Continual Learning?

About

Prompt-based continual learning methods effectively mitigate catastrophic forgetting. However, most existing methods assign a fixed set of prompts to each task, completely isolating knowledge across tasks and leading to suboptimal parameter utilization. To address this, we revisit the practical requirements of continual learning and propose a prompt-sharing framework. The framework constructs a global prompt pool and introduces a task-aware gated routing mechanism that sparsely activates a subset of prompts, enabling dynamic decoupling and collaborative optimization of task-specific feature representations. Furthermore, we introduce a history-aware modulator that leverages cumulative prompt activation statistics to protect frequently used prompts from excessive updates, mitigating both inefficient parameter usage and knowledge forgetting. Extensive analysis and empirical results demonstrate that our approach consistently outperforms existing static allocation strategies in both effectiveness and efficiency.

Jiangyang Li, Chenhao Ding, Songlin Dong, Qiang Wang, Jianchao Zhao, Yuhang He, Yihong Gong • 2026
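
A minimal PyTorch sketch of the two mechanisms the abstract describes: a shared prompt pool with task-aware gated routing that sparsely activates a top-k subset of prompts, and a history-aware modulator that uses cumulative activation counts to damp updates to frequently used prompts. The class name, dimensions, top-k selection, and gradient-scaling formulation are illustrative assumptions, not the authors' implementation.

# Sketch under stated assumptions; not the paper's released code.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SharedPromptPool(nn.Module):
    def __init__(self, pool_size=20, prompt_len=5, dim=768, top_k=4):
        super().__init__()
        # Global prompt pool shared across all tasks.
        self.prompts = nn.Parameter(torch.randn(pool_size, prompt_len, dim) * 0.02)
        # Task-aware gate: maps a task-conditioned query to one score per prompt.
        self.gate = nn.Linear(dim, pool_size)
        self.top_k = top_k
        # Cumulative activation counts consumed by the history-aware modulator.
        self.register_buffer("usage", torch.zeros(pool_size))

    def forward(self, task_emb):
        # task_emb: (batch, dim), e.g. a task token or a [CLS] feature (assumption).
        scores = self.gate(task_emb)                              # (batch, pool_size)
        topk_val, topk_idx = scores.topk(self.top_k, dim=-1)      # sparse activation
        weights = F.softmax(topk_val, dim=-1)                     # (batch, k)

        selected = self.prompts[topk_idx]                         # (batch, k, len, dim)

        # History-aware modulation (illustrative): prompts activated often in the
        # past receive smaller gradient updates, implemented here by mixing each
        # selected prompt with a detached copy of itself.
        protect = self.usage[topk_idx] / (self.usage.max() + 1e-6)   # (batch, k)
        protect = protect.clamp(0, 0.9).unsqueeze(-1).unsqueeze(-1)
        selected = protect * selected.detach() + (1 - protect) * selected

        if self.training:
            # Update cumulative activation statistics without tracking gradients.
            with torch.no_grad():
                self.usage.scatter_add_(
                    0, topk_idx.reshape(-1),
                    torch.ones_like(topk_idx, dtype=torch.float).reshape(-1))

        # Weighted combination of the selected prompts, to be prepended to the input.
        prompts = (weights.unsqueeze(-1).unsqueeze(-1) * selected).sum(dim=1)
        return prompts                                             # (batch, len, dim)


if __name__ == "__main__":
    pool = SharedPromptPool()
    task_emb = torch.randn(8, 768)      # dummy task-conditioned queries
    print(pool(task_emb).shape)         # torch.Size([8, 5, 768])

Because the gate scores every prompt in the global pool, knowledge can be reused across tasks, while the top-k sparsity keeps the per-task computation and interference limited; the detach-based mixing is one simple way to realize the "protect frequently used prompts" idea.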

Related benchmarks

Task | Dataset | Metric | Result | Rank
Image Classification | CIFAR-100 Split | Accuracy | 95.02 | 61
Class-incremental learning | Split ImageNet-R | Average Forgetting Measure | 2.63 | 57
Class-incremental learning | 5-Datasets | FAA | 95.12 | 17
Class-incremental learning | CUB-200 Split | FAA | 91.34 | 17
Continual Learning | CORe50 | AN Score | 87.94 | 14
Class-incremental learning | CIFAR-100 Split | FAA | 95.02 | 10
Long sequence incremental learning | CIFAR-100 Split (test) | FAA | 94.07 | 7
Long sequence incremental learning | ImageNet-R Split (test) | FAA | 0.7657 | 7
Image Classification | CIFAR100 10 tasks (test) | FAA | 95.02 | 6
Image Classification | CUB-200 incremental tasks | FAA | 91.34 | 6

(10 of 11 rows shown.)
