
Elastic Weight Consolidation Done Right for Continual Learning

About

Weight regularization methods in continual learning (CL) alleviate catastrophic forgetting by assessing and penalizing changes to important model weights. Elastic Weight Consolidation (EWC) is a foundational and widely used approach within this framework that estimates weight importance from gradients. However, it has consistently shown suboptimal performance. In this paper, we conduct a systematic analysis of importance estimation in EWC from a gradient-based perspective. For the first time, we find that EWC's reliance on the Fisher Information Matrix (FIM) leads to gradient vanishing and inaccurate importance estimation in certain scenarios. Our analysis also reveals that Memory Aware Synapses (MAS), a variant of EWC, imposes unnecessary constraints on parameters irrelevant to prior tasks, which we term redundant protection. Consequently, both EWC and its variants exhibit fundamental misalignments in estimating weight importance, leading to inferior performance. To tackle these issues, we propose the Logits Reversal (LR) operation, a simple yet effective modification that rectifies EWC's importance estimation. Specifically, reversing the logit values during the calculation of the FIM effectively prevents both gradient vanishing and redundant protection. Extensive experiments across various CL tasks and datasets show that the proposed method significantly outperforms EWC and its variants. We therefore refer to it as EWC Done Right (EWC-DR). Code is available at https://github.com/scarlet0703/EWC-DR.
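The abstract's core mechanism can be sketched as follows. This is a minimal pure-Python illustration, not the authors' implementation: it computes the diagonal empirical Fisher for a linear softmax classifier and applies the standard EWC quadratic penalty. The `reverse_logits` flag encodes one plausible reading of the Logits Reversal (LR) operation, namely negating the logits before the softmax when accumulating the FIM; the paper's exact definition may differ, and all function names here are illustrative.

```python
import math

def softmax(logits):
    """Numerically stable softmax over a list of logits."""
    m = max(logits)
    exps = [math.exp(z - m) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def diag_fisher(W, data, reverse_logits=False):
    """Diagonal empirical Fisher F[k][j] = E[(d log p_y / d W[k][j])^2]
    for a linear softmax classifier with weight matrix W (C x D).

    reverse_logits: assumed reading of the Logits Reversal (LR)
    operation -- negate the logits before the softmax when estimating
    importance. (The resulting sign flip on the gradient vanishes
    under squaring; what changes is the probability vector p.)
    """
    C, D = len(W), len(W[0])
    F = [[0.0] * D for _ in range(C)]
    for x, y in data:
        logits = [sum(W[k][j] * x[j] for j in range(D)) for k in range(C)]
        if reverse_logits:
            logits = [-z for z in logits]
        p = softmax(logits)
        for k in range(C):
            g = (1.0 if k == y else 0.0) - p[k]  # d log p_y / d logit_k
            for j in range(D):
                F[k][j] += (g * x[j]) ** 2
    n = len(data)
    return [[v / n for v in row] for row in F]

def ewc_penalty(W, W_star, F, lam):
    """Standard EWC regularizer: (lam/2) * sum_i F_i * (theta_i - theta*_i)^2,
    anchoring the current weights W to the old-task weights W_star."""
    return 0.5 * lam * sum(
        F[k][j] * (W[k][j] - W_star[k][j]) ** 2
        for k in range(len(W)) for j in range(len(W[0]))
    )
```

In this sketch, a well-fit model drives p toward a one-hot vector, so the gradient term (1[k=y] - p[k]) shrinks toward zero, which is the gradient-vanishing failure mode the abstract attributes to vanilla FIM estimation; reversing the logits keeps p away from one-hot and the squared gradients nonzero.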

Xuan Liu, Xiaobin Chang • 2026

Related benchmarks

Task | Dataset | Metric | Result | Rank
Exemplar-Free Class-Incremental Learning | CIFAR-100 | Avg Top-1 Inc Acc | 63.75 | 68
Exemplar-Free Class-Incremental Learning | TinyImageNet | Top-1 Acc (Inc) | 47 | 62
Exemplar-Free Class-Incremental Learning | CIFAR-100 Big start | Average Incremental Accuracy (Aavg) | 63.9 | 39
Natural Language Visual Reasoning | NLVR2 | Accuracy | 72.77 | 21
Exemplar-Free Class-Incremental Learning | CIFAR-100 Equally split | Aavg | 61.5 | 15
Visual Question Answering | VQA v2 | Accuracy | 67.72 | 8
Visual Entailment | SNLI-VE | Accuracy | 74.58 | 4
