
Neural Collapse Terminus: A Unified Solution for Class Incremental Learning and Its Variants

About

How to enable learning of new classes while preserving performance on old classes is a central challenge in class incremental learning (CIL). Beyond this standard setting, long-tail class incremental learning (LTCIL) and few-shot class incremental learning (FSCIL) additionally account for data imbalance and data scarcity, respectively, both of which are common in real-world deployments and further exacerbate the well-known problem of catastrophic forgetting. Existing methods target only one of the three tasks. In this paper, we offer a unified solution to the misalignment dilemma underlying all three. Concretely, we propose the neural collapse terminus, a fixed structure with maximal equiangular inter-class separation over the whole label space. It serves as a consistent target throughout incremental training, avoiding the need to divide the feature space incrementally. For CIL and LTCIL, we further propose a prototype evolving scheme that smoothly drives backbone features toward the neural collapse terminus. With only minor adaptations, our method also works for FSCIL. Theoretical analysis indicates that our method retains neural collapse optimality in an incremental fashion, regardless of data imbalance or data scarcity. We also design a generalized case, in which the total number of classes is unknown and each incoming session may be normal, long-tail, or few-shot, to test the generalizability of our method. Extensive experiments on multiple datasets demonstrate the effectiveness of our unified solution on all three tasks and the generalized case.
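The "fixed structure with maximal equiangular inter-class separation" described above is commonly realized as a simplex equiangular tight frame (ETF): K unit-norm prototypes whose pairwise cosine similarity is -1/(K-1), the most negative value achievable for K equiangular vectors. As an illustrative sketch (not the authors' code; `simplex_etf` and its parameters are hypothetical names), such a set of prototypes can be constructed like this:

```python
import numpy as np

def simplex_etf(num_classes: int, feat_dim: int, seed: int = 0) -> np.ndarray:
    """Build a d x K matrix whose columns form a simplex ETF.

    Each column is a unit-norm class prototype; any two distinct
    prototypes have cosine similarity -1/(K-1). Requires d >= K-1.
    Illustrative sketch only, not the paper's implementation.
    """
    K = num_classes
    assert feat_dim >= K - 1, "feature dimension too small for K-simplex"
    rng = np.random.default_rng(seed)
    # Random orthonormal basis U in R^{d x K} via reduced QR.
    U, _ = np.linalg.qr(rng.standard_normal((feat_dim, K)))
    # Center the identity so the K prototypes sum to zero, then rescale
    # so each column has unit norm.
    centering = np.eye(K) - np.ones((K, K)) / K
    return np.sqrt(K / (K - 1)) * U @ centering

# Example: 10 classes embedded in a 64-dimensional feature space.
W = simplex_etf(num_classes=10, feat_dim=64)
G = W.T @ W  # Gram matrix: 1 on the diagonal, -1/9 off the diagonal
```

Because the structure is fixed for the whole label space in advance, prototypes for classes arriving in later sessions are already maximally separated from those of earlier sessions, which is what lets a single target serve all incremental settings.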

Yibo Yang, Haobo Yuan, Xiangtai Li, Jianlong Wu, Lefei Zhang, Zhouchen Lin, Philip Torr, Dacheng Tao, Bernard Ghanem (2023)

Related benchmarks

Task                         Dataset             Metric                         Result   Rank
Image Classification         Seq-CIFAR-100       Accuracy                       76.06    52
Image Classification         CIFAR-10 Seq        Final Average Accuracy         81.27    52
Image Classification         Seq-Tiny-ImageNet   Final Average Accuracy         62.3     44
Task-Incremental Learning    CIFAR-100 Seq       FAA                            75.87    28
Class-Incremental Learning   CIFAR-10 Seq        Final Average Accuracy (FAA)   60.93    28
Task-Incremental Learning    Seq-CIFAR-10        FAA                            81.27    28
Task-Incremental Learning    Tiny ImageNet Seq   FAA                            62.3     24
Class-Incremental Learning   TinyImageNet Seq    FAA                            18.24    24
