
LoRA Subtraction for Drift-Resistant Space in Exemplar-Free Continual Learning

About

In continual learning (CL), catastrophic forgetting often arises due to feature drift. This challenge is particularly prominent in the exemplar-free continual learning (EFCL) setting, where samples from previous tasks cannot be retained, making it difficult to preserve prior knowledge. To address this issue, some EFCL methods aim to identify feature spaces that minimize the impact on previous tasks while accommodating new ones. However, they rely on static features or outdated statistics stored from old tasks, which prevents them from capturing the dynamic evolution of the feature space in CL, leading to performance degradation over time. In this paper, we introduce the Drift-Resistant Space (DRS), which effectively handles feature drift without requiring explicit feature modeling or the storage of previous tasks. A novel parameter-efficient fine-tuning approach called Low-Rank Adaptation Subtraction (LoRA-) is proposed to develop the DRS. This method subtracts the LoRA weights of old tasks from the initial pre-trained weights before processing new task data, establishing the DRS for model training. As a result, LoRA- enhances stability, improves efficiency, and simplifies implementation. Furthermore, stabilizing feature drift allows for better plasticity by learning with a triplet loss. Our method consistently achieves state-of-the-art results, especially for long task sequences, across multiple datasets.
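The core operation described above, subtracting the accumulated old-task LoRA updates from the frozen pre-trained weights before training a new task, can be sketched as below. This is a minimal illustration under common LoRA conventions (each task t contributes a low-rank update ΔW_t = B_t A_t); the function and variable names (`build_drs_weight`, `past_loras`, rank `r`) are illustrative, not taken from the paper's code.

```python
import numpy as np

def lora_delta(A, B):
    """Low-rank LoRA update B @ A, with A of shape (r, d_in) and B of shape (d_out, r)."""
    return B @ A

def build_drs_weight(W0, past_loras):
    """Subtract every old task's LoRA update from the frozen pre-trained
    weight W0, yielding the drift-resistant space (DRS) weight in which
    the new task is trained."""
    W = W0.copy()
    for A, B in past_loras:
        W = W - lora_delta(A, B)
    return W

# Toy usage: three previous tasks, each with its own rank-2 adapter.
rng = np.random.default_rng(0)
d_out, d_in, r = 8, 4, 2
W0 = rng.standard_normal((d_out, d_in))
past = [(rng.standard_normal((r, d_in)), rng.standard_normal((d_out, r)))
        for _ in range(3)]
W_drs = build_drs_weight(W0, past)
print(W_drs.shape)  # same shape as W0: (8, 4)
```

Note that, unlike replay or stored-statistics methods, this only requires keeping the low-rank factors of past tasks, which is what makes the approach exemplar-free and lightweight.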

Xuan Liu, Xiaobin Chang • 2025

Related benchmarks

Task                  | Dataset                   | Metric            | Result  | Rank
----------------------|---------------------------|-------------------|---------|-----
Audio Classification  | ESC-50 (test)             | Accuracy          | 57.5    | 84
Audio Classification  | US8K (test)               | R@1 Accuracy      | 0.5781  | 41
Audio Classification  | Speech Commands V2 (test) | Accuracy          | 34.24   | 35
Audio Classification  | VocalSet (test)           | Top-1 Accuracy    | 24.01   | 18
Audio Classification  | TIMIT-2 (test)            | Top-1 Accuracy    | 0.00e+0 | 18
Audio Classification  | TIMIT 3 (test)            | Average Top-1 Acc | 0.00e+0 | 18
