
Memorizing Complementation Network for Few-Shot Class-Incremental Learning

About

Few-shot Class-Incremental Learning (FSCIL) aims to learn new concepts continually from only a few samples, a setting prone to catastrophic forgetting and overfitting. The inaccessibility of old classes and the scarcity of novel samples make it difficult to balance retaining old knowledge against learning novel concepts. Inspired by the observation that different models memorize different knowledge when learning novel concepts, we propose a Memorizing Complementation Network (MCNet), which ensembles multiple models so that their differently memorized knowledge complements each other on novel tasks. Additionally, to update the model with few novel samples, we develop a Prototype Smoothing Hard-mining Triplet (PSHT) loss that pushes each novel sample away not only from the other novel samples in the current task but also from the old-class distribution. Extensive experiments on three benchmark datasets, i.e., CIFAR100, miniImageNet and CUB200, demonstrate the superiority of the proposed method.
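To make the PSHT idea concrete, here is an illustrative PyTorch sketch of a triplet-style loss of this kind. It is not the paper's exact formulation: the smoothing coefficient `alpha`, the `margin`, the mining strategy (closest negative), and the use of mean class prototypes are all assumptions. Negatives are drawn both from other novel samples in the batch and from stored old-class prototypes, matching the abstract's description of pushing novel samples away from each other and from the old distribution.

```python
import torch
import torch.nn.functional as F


def psht_loss(feats, labels, old_protos, margin=0.5, alpha=0.9):
    """Sketch of a Prototype Smoothing Hard-mining Triplet loss.

    feats:      (B, D) embeddings of novel samples in the current session
    labels:     (B,)   novel-class labels
    old_protos: (C, D) stored prototypes of old classes
    """
    # L2-normalize embeddings and prototypes so distances are comparable
    feats = F.normalize(feats, dim=1)
    old_protos = F.normalize(old_protos, dim=1)

    # mean prototypes of the novel classes present in the batch
    classes = labels.unique()
    protos = torch.stack([feats[labels == c].mean(dim=0) for c in classes])
    # index of each sample's own class prototype
    idx = torch.stack([(classes == y).nonzero().squeeze() for y in labels])

    # positive target: prototype smoothed toward the sample (assumed scheme)
    pos_target = alpha * protos[idx] + (1 - alpha) * feats
    d_pos = (feats - pos_target).norm(dim=1)

    # negative candidates: other novel samples plus old-class prototypes
    cand = torch.cat([feats, old_protos], dim=0)
    d_all = torch.cdist(feats, cand)  # (B, B + C)
    # mask out same-class pairs (including self) among batch samples
    same = labels.unsqueeze(1) == labels.unsqueeze(0)
    pad = torch.zeros(feats.size(0), old_protos.size(0), dtype=torch.bool)
    d_all = d_all.masked_fill(torch.cat([same, pad], dim=1), float("inf"))
    # hard mining: take the closest remaining negative per anchor
    d_neg = d_all.min(dim=1).values

    return F.relu(d_pos - d_neg + margin).mean()
```

A typical call would pass the current session's embeddings and the frozen prototypes of all previously seen classes, so the gradient simultaneously tightens novel clusters and keeps them clear of the old feature distribution.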

Zhong Ji, Zhishen Hou, Xiyao Liu, Yanwei Pang, Xuelong Li • 2022

Related benchmarks

Task: Few-Shot Class-Incremental Learning (all rows)

Dataset                       | Metric               | Result | Rank
miniImageNet (test)           | Accuracy (Session 1) | 67.7   | 173
CIFAR100 (test)               | Session 4 Top-1 Acc  | 58.75  | 122
CUB200 (test)                 | Accuracy (Session 1) | 73.96  | 92
CUB-200                       | Session 1 Accuracy   | 73.96  | 75
CIFAR100                      | Accuracy (S0)        | 73.3   | 67
CUB200 (incremental sessions) | Session 0 Accuracy   | 77.57  | 37
MiniImagenet                  | Avg Accuracy         | 58.64  | 31
