
Supervised Contrastive Replay: Revisiting the Nearest Class Mean Classifier in Online Class-Incremental Continual Learning

About

Online class-incremental continual learning (CL) studies the problem of learning new classes continually from an online non-stationary data stream, intending to adapt to new data while mitigating catastrophic forgetting. While memory replay has shown promising results, the recency bias in online learning caused by the commonly used Softmax classifier remains an unsolved challenge. Although the Nearest-Class-Mean (NCM) classifier is significantly undervalued in the CL community, we demonstrate that it is a simple yet effective substitute for the Softmax classifier. It addresses the recency bias and avoids structural changes in the fully-connected layer for new classes. Moreover, we observe considerable and consistent performance gains when replacing the Softmax classifier with the NCM classifier for several state-of-the-art replay methods. To leverage the NCM classifier more effectively, data embeddings belonging to the same class should be clustered and well-separated from those with a different class label. To this end, we contribute Supervised Contrastive Replay (SCR), which explicitly encourages samples from the same class to cluster tightly in embedding space while pushing those of different classes further apart during replay-based training. Overall, we observe that our proposed SCR substantially reduces catastrophic forgetting and outperforms state-of-the-art CL methods by a significant margin on a variety of datasets.
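The two ingredients the abstract describes can be sketched compactly: an NCM classifier predicts the class whose mean embedding is closest to the query, and a supervised contrastive loss pulls same-class embeddings together while pushing different-class ones apart. The sketch below is a minimal NumPy illustration under assumed inputs (a buffer of exemplar embeddings with labels), not the authors' implementation; function names are hypothetical.

```python
import numpy as np

def ncm_classify(query_emb, buffer_embs, buffer_labels):
    """Nearest-Class-Mean classification (hypothetical helper): assign the
    query the label of the closest class-mean prototype in embedding space."""
    classes = np.unique(buffer_labels)
    # Class means ("prototypes") computed from buffered exemplar embeddings.
    means = np.stack([buffer_embs[buffer_labels == c].mean(axis=0)
                      for c in classes])
    # L2-normalize so distance comparisons are scale-invariant.
    means = means / np.linalg.norm(means, axis=1, keepdims=True)
    q = query_emb / np.linalg.norm(query_emb)
    dists = np.linalg.norm(means - q, axis=1)
    return classes[np.argmin(dists)]

def supcon_loss(embs, labels, temperature=0.07):
    """Supervised contrastive loss over a batch of L2-normalized embeddings:
    same-class pairs act as positives, all other samples as negatives.
    Simplified single-view sketch, not the paper's training code."""
    n = len(labels)
    sim = embs @ embs.T / temperature
    # Exclude self-similarity from the denominator.
    mask_self = ~np.eye(n, dtype=bool)
    exp_sim = np.exp(sim) * mask_self
    log_prob = sim - np.log(exp_sim.sum(axis=1, keepdims=True))
    pos = (labels[:, None] == labels[None, :]) & mask_self
    # Average log-probability over each anchor's positives, then over batch.
    per_anchor = -(log_prob * pos).sum(axis=1) / np.maximum(pos.sum(axis=1), 1)
    return per_anchor.mean()
```

Because the NCM prediction depends only on the buffered class means, adding a new class requires no change to a fully-connected output layer, which is the structural advantage the abstract highlights.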

Zheda Mai, Ruiwen Li, Hyunwoo Kim, Scott Sanner • 2021

Related benchmarks

Task                  Dataset                               Metric                    Result   Rank
Continual Learning    Split CIFAR10 32x32 (test)            Accuracy                  59.8     66
Continual Learning    MiniImageNet Split 84x84 (test)       Accuracy                  18.6     66
Continual Learning    CIFAR100 Split 32x32 (test)           Accuracy                  19.3     66
Continual Learning    Tiny-ImageNet Split 100 tasks (test)  AF (%)                    14.9     60
Continual Learning    Split CIFAR-100 10 tasks              Accuracy                  36.5     60
Continual Learning    Split CIFAR-100 (10 tasks) (test)     Accuracy                  17.5     60
Continual Learning    CIFAR-100                             Accuracy                  91.9     56
Continual Learning    CIFAR100 10 tasks (test)              Average Forgetting Rate   5.6      51
Continual Learning    TinyImageNet 100 tasks (test)         Average Forgetting Rate   14.9     51
Image Classification  CIFAR10 5 tasks (test)                Accuracy                  64.1     51
Showing 10 of 28 rows
