Is Parameter Isolation Better for Prompt-Based Continual Learning?
About
Prompt-based continual learning methods effectively mitigate catastrophic forgetting. However, most existing methods assign a fixed set of prompts to each task, completely isolating knowledge across tasks and leaving parameters underutilized. To address this, we propose a prompt-sharing framework motivated by the practical needs of continual learning. The framework constructs a global prompt pool and introduces a task-aware gated routing mechanism that sparsely activates a subset of prompts, enabling dynamic decoupling and collaborative optimization of task-specific feature representations. Furthermore, we introduce a history-aware modulator that leverages cumulative prompt-activation statistics to shield frequently used prompts from excessive updates, thereby mitigating both inefficient parameter usage and knowledge forgetting. Extensive analysis and empirical results demonstrate that our approach consistently outperforms existing static allocation strategies in both effectiveness and efficiency.
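The two mechanisms described above can be sketched in a few lines of PyTorch. This is an illustrative, hedged sketch, not the paper's actual implementation: the class name `GatedPromptPool`, the top-k softmax gate, and the `1 / (1 + log(1 + usage))` damping schedule are all assumptions chosen to make the idea concrete. The damping uses the standard "straight-through" trick `d * x + (1 - d) * x.detach()`, which leaves the forward value unchanged while scaling the gradient flowing into frequently activated prompts by `d`.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class GatedPromptPool(nn.Module):
    """Sketch (not the authors' code): shared prompt pool with task-aware
    top-k gated routing and a history-aware update modulator."""

    def __init__(self, pool_size=20, prompt_len=5, dim=768, top_k=4):
        super().__init__()
        self.pool_size = pool_size
        self.top_k = top_k
        self.prompts = nn.Parameter(torch.randn(pool_size, prompt_len, dim) * 0.02)
        self.router = nn.Linear(dim, pool_size)  # hypothetical task-conditioned gate
        # cumulative activation counts used to damp updates to busy prompts
        self.register_buffer("usage", torch.zeros(pool_size))

    def forward(self, task_feat):
        # task_feat: (B, dim), e.g. a [CLS] query from a frozen backbone
        logits = self.router(task_feat)               # (B, pool_size)
        topv, topi = logits.topk(self.top_k, dim=-1)  # sparse activation
        gates = F.softmax(topv, dim=-1)               # (B, top_k)

        if self.training:  # accumulate activation statistics
            self.usage += torch.bincount(
                topi.flatten(), minlength=self.pool_size).float()

        # history-aware modulation: frequently used prompts get smaller updates
        damp = 1.0 / (1.0 + self.usage.log1p())       # (pool_size,)
        selected = self.prompts[topi]                 # (B, top_k, L, dim)
        d = damp[topi].view(*topi.shape, 1, 1)        # (B, top_k, 1, 1)
        selected = d * selected + (1.0 - d) * selected.detach()

        # weighted combination of the activated prompts -> (B, L, dim)
        return (gates.view(*gates.shape, 1, 1) * selected).sum(dim=1)

model = GatedPromptPool(pool_size=10, prompt_len=4, dim=16, top_k=3)
model.train()
out = model(torch.randn(2, 16))   # shape (2, 4, 16); usage now sums to 2 * top_k
```

In a full system the combined prompt would be prepended to the token sequence of a frozen ViT backbone; only the pool and the router would be trained.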
Related benchmarks
| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Image Classification | CIFAR-100 Split | Accuracy | 95.02 | 61 |
| Class-incremental learning | Split ImageNet-R | Average Forgetting Measure | 2.63 | 57 |
| Class-incremental learning | 5-Datasets | FAA | 95.12 | 17 |
| Class-incremental learning | CUB-200 Split | FAA | 91.34 | 17 |
| Continual Learning | CORe50 | AN Score | 87.94 | 14 |
| Class-incremental learning | CIFAR-100 Split | FAA | 95.02 | 10 |
| Long sequence incremental learning | CIFAR-100 Split (test) | FAA | 94.07 | 7 |
| Long sequence incremental learning | ImageNet-R Split (test) | FAA | 0.7657 | 7 |
| Image Classification | CIFAR100 10 tasks (test) | FAA | 95.02 | 6 |
| Image Classification | CUB-200 incremental tasks | FAA | 91.34 | 6 |