
Improving Continual Learning Performance and Efficiency with Auxiliary Classifiers

About

Continual learning is crucial for applying machine learning in challenging, dynamic, and often resource-constrained environments. However, catastrophic forgetting, the overwriting of previously learned knowledge when new information is acquired, remains a major challenge. In this work, we examine the intermediate representations in neural network layers during continual learning and find that such representations are less prone to forgetting, highlighting their potential to accelerate computation. Motivated by these findings, we propose to use auxiliary classifiers (ACs) to enhance performance and demonstrate that integrating ACs into various continual learning methods consistently improves accuracy across diverse evaluation settings, yielding an average 10% relative gain. We also leverage the ACs to reduce the average inference cost by 10-60% without compromising accuracy, enabling the model to return predictions before computing all the layers. Our approach provides a scalable and efficient solution for continual learning.
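The mechanism described above can be sketched in code: auxiliary classifiers are attached to intermediate blocks of a backbone, all heads are trained jointly, and at inference the model exits at the first head whose confidence clears a threshold. This is a minimal illustrative sketch, not the paper's implementation; the `ACNet` class, the tiny convolutional backbone, the `predict_early_exit` method, and the confidence threshold of 0.9 are all assumptions made for the example.

```python
import torch
import torch.nn as nn


class ACNet(nn.Module):
    """Backbone with auxiliary classifiers (ACs) attached to intermediate blocks.

    Illustrative only: the paper attaches ACs to an existing continual-learning
    backbone; here we use a toy 3-block CNN so the example is self-contained.
    """

    def __init__(self, num_classes: int = 10, threshold: float = 0.9):
        super().__init__()
        self.blocks = nn.ModuleList([
            nn.Sequential(nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(8)),
            nn.Sequential(nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(4)),
            nn.Sequential(nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1)),
        ])
        # One classifier head per block; the last one acts as the final classifier.
        self.acs = nn.ModuleList([
            nn.Linear(16 * 8 * 8, num_classes),
            nn.Linear(32 * 4 * 4, num_classes),
            nn.Linear(64, num_classes),
        ])
        self.threshold = threshold

    def forward(self, x):
        """Training pass: return logits from every classifier so all heads
        can be trained jointly (e.g. by summing their losses)."""
        logits = []
        for block, ac in zip(self.blocks, self.acs):
            x = block(x)
            logits.append(ac(x.flatten(1)))
        return logits

    @torch.no_grad()
    def predict_early_exit(self, x):
        """Inference pass for a single sample: stop at the first AC whose
        max softmax confidence reaches the threshold, skipping later layers."""
        for i, (block, ac) in enumerate(zip(self.blocks, self.acs)):
            x = block(x)
            probs = ac(x.flatten(1)).softmax(dim=-1)
            conf, pred = probs.max(dim=-1)
            if conf.item() >= self.threshold or i == len(self.blocks) - 1:
                return pred.item(), i  # predicted class and exit index
```

Samples that exit at an early AC never touch the remaining blocks, which is where the 10-60% average inference-cost reduction reported in the abstract would come from; the exact saving depends on the threshold and how confident the early heads are.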

Filip Szatkowski, Yaoyue Zheng, Fei Yang, Bartłomiej Twardowski, Tomasz Trzciński, Joost van de Weijer • 2024

Related benchmarks

Task | Dataset | Result | Rank
Class-incremental learning | CIFAR-100 | Average Accuracy 54.2 | 116
Class-incremental learning | CIFAR100 (test) | -- | 116
Online Class-Incremental Learning | Tiny-ImageNet | Average Accuracy 26.5 | 60
Online Class-Incremental Learning | CIFAR-10 | Average Accuracy 75.8 | 30
Online Class-Incremental Learning | CIFAR-100 | Average Accuracy 54.2 | 30
Online Class-Incremental Learning | CIFAR-10 (test) | Average Forgetting 12.8 | 30
Online Class-Incremental Learning | Tiny ImageNet (test) | Average Forgetting 20.1 | 30
Class-incremental learning | MNIST | Average Accuracy 97.2 | 8
