
Essentials for Class Incremental Learning

About

Contemporary neural networks are limited in their ability to learn from evolving streams of training data. When trained sequentially on new or evolving tasks, their accuracy drops sharply, making them unsuitable for many real-world applications. In this work, we shed light on the causes of this well-known yet unsolved phenomenon, often referred to as catastrophic forgetting, in a class-incremental setup. We show that a combination of simple components and a loss that balances intra-task and inter-task learning can already resolve forgetting to the same extent as more complex measures proposed in the literature. Moreover, we identify the poor quality of the learned representation as another cause of catastrophic forgetting in class-IL. We show that performance correlates with the secondary class information (dark knowledge) learned by the model, and that it can be improved by an appropriate regularizer. With these lessons learned, class-incremental learning results on CIFAR-100 and ImageNet improve over the state of the art by a large margin, while keeping the approach simple.
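The "secondary class information (dark knowledge)" mentioned above is commonly preserved across tasks via temperature-scaled knowledge distillation: the new model is penalized for diverging from the previous-task model's softened class distribution. The sketch below is an illustrative, minimal NumPy version of such a distillation term under standard assumptions (Hinton-style temperature softmax, T² scaling); it is not the paper's exact loss, and the function names are our own.

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; higher T flattens the distribution,
    # exposing more secondary-class ("dark knowledge") structure.
    z = np.asarray(logits, dtype=float) / T
    z = z - z.max()  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

def distillation_loss(student_logits, teacher_logits, T=2.0):
    # KL(teacher || student) between temperature-softened distributions,
    # scaled by T^2 so gradients keep a comparable magnitude across T.
    p = softmax(teacher_logits, T)  # previous-task model (frozen teacher)
    q = softmax(student_logits, T)  # current model being trained
    return (T ** 2) * float(np.sum(p * (np.log(p) - np.log(q))))
```

In a class-incremental training loop, this term would be added to the classification loss on the new classes, so the model learns the new task while being regularized to keep the old model's inter-class similarity structure. When student and teacher agree, the term vanishes; any drift away from the teacher's soft targets is penalized.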

Sudhanshu Mittal, Silvio Galesso, Thomas Brox• 2021

Related benchmarks

Task | Dataset | Metric | Result | Rank
--- | --- | --- | --- | ---
Continual Learning | Tiny-ImageNet Split 100 tasks (test) | AF (%) | 48.9 | 60
Continual Learning | Split CIFAR-100 10 tasks | Accuracy | 20.5 | 60
Continual Learning | Split CIFAR-100 (10 tasks) (test) | Accuracy | 17.5 | 60
Class-incremental learning | CIFAR-100 | -- | -- | 60
Continual Learning | CIFAR10 5 tasks (test) | Avg Forgetting Rate | 12.5 | 51
Image Classification | CIFAR10 5 tasks (test) | Accuracy | 59.8 | 51
Image Classification | MNIST 5 tasks (test) | Accuracy | 94 | 51
Continual Learning | CIFAR100 10 tasks (test) | Average Forgetting Rate | 16.1 | 51
Image Classification | TinyImageNet 100 tasks (test) | Accuracy | 15.2 | 51
Continual Learning | MNIST 5 tasks (test) | Average Forgetting Rate | 4.8 | 51

(Showing 10 of 18 rows)

Other info

Code
