
Mnemonics Training: Multi-Class Incremental Learning without Forgetting

About

Multi-Class Incremental Learning (MCIL) aims to learn new concepts by incrementally updating a model trained on previous concepts. However, there is an inherent trade-off in effectively learning new concepts without catastrophic forgetting of previous ones. To alleviate this issue, it has been proposed to keep a few examples of the previous concepts, but the effectiveness of this approach heavily depends on how representative these examples are. This paper proposes a novel and automatic framework we call mnemonics, in which we parameterize exemplars and make them optimizable in an end-to-end manner. We train the framework through a bilevel optimization, i.e., model-level and exemplar-level. We conduct extensive experiments on three MCIL benchmarks, CIFAR-100, ImageNet-Subset and ImageNet, and show that using mnemonics exemplars can surpass the state-of-the-art by a large margin. Interestingly, the mnemonics exemplars tend to lie on the boundaries between different classes.
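The bilevel idea — an inner, model-level fit on the exemplars and an outer, exemplar-level update evaluated through that fit — can be illustrated on a toy problem. This is a minimal sketch, not the paper's implementation: the names (`inner_fit`, `outer_loss`), the scalar-regression setup, the closed-form inner solve (instead of SGD), and the finite-difference outer gradient (instead of backpropagating through training) are all simplifying assumptions for illustration.

```python
import numpy as np

# Toy bilevel exemplar optimization (illustrative sketch only).
# Inner level: fit a scalar weight w on the exemplars (closed form for y = w*x).
# Outer level: adjust the exemplar inputs so the exemplar-trained w fits the full data.

rng = np.random.default_rng(0)
X_full = rng.uniform(1.0, 2.0, size=50)
y_full = 3.0 * X_full  # true relation y = 3x

def inner_fit(ex_x, ex_y):
    """Model-level step: least-squares w on the exemplars (closed form)."""
    return np.dot(ex_x, ex_y) / np.dot(ex_x, ex_x)

def outer_loss(ex_x, ex_y):
    """Exemplar-level objective: loss of the exemplar-trained model on full data."""
    w = inner_fit(ex_x, ex_y)
    return np.mean((w * X_full - y_full) ** 2)

# Exemplar inputs are the optimizable parameters; targets stay fixed.
ex_x = np.array([0.5, 1.5])
ex_y = np.array([2.0, 3.0])

lr, eps = 0.01, 1e-5
for _ in range(1000):
    grad = np.zeros_like(ex_x)
    for i in range(len(ex_x)):  # finite-difference gradient w.r.t. each exemplar input
        e = np.zeros_like(ex_x)
        e[i] = eps
        grad[i] = (outer_loss(ex_x + e, ex_y) - outer_loss(ex_x - e, ex_y)) / (2 * eps)
    ex_x -= lr * grad  # exemplar-level update

final = outer_loss(ex_x, ex_y)
```

In the paper's setting the inner solve is several steps of gradient descent on a deep network and the outer gradient flows through those steps; the toy version keeps only the two-level structure, which is what makes the exemplars themselves trainable.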

Yaoyao Liu, Yuting Su, An-An Liu, Bernt Schiele, Qianru Sun · 2020

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
| --- | --- | --- | --- | --- |
| Class-incremental learning | CIFAR-100 (test) | Avg Acc | 63.34 | 76 |
| Class-incremental learning | ImageNet-100 | Avg Acc | 75.54 | 74 |
| Class-incremental learning | CIFAR-100 | Average Accuracy | 64.59 | 60 |
| Incremental Learning | ImageNet subset | Average Accuracy | 72.6 | 58 |
| Incremental Learning | CIFAR-100 | Average Accuracy | 63.34 | 51 |
| Incremental Learning | ImageNet full | Average Accuracy | 65.4 | 48 |
| Class-incremental learning | CIFAR-100 (incremental) | Avg Incremental Acc | 63.34 | 26 |
| Class-incremental learning | ImageNet-1K standard (test val) | Average Accuracy | 64.54 | 22 |
| Class-incremental learning | ImageNet-100 | -- | -- | 20 |
| Class-incremental learning | ImageNet1000 (incremental) | Avg Incremental Accuracy | 63.01 | 15 |

Showing 10 of 11 rows
