Learn by Reasoning: Analogical Weight Generation for Few-Shot Class-Incremental Learning

About

Few-Shot Class-Incremental Learning (FSCIL) enables models to learn new classes from limited data while retaining performance on previously learned classes. Traditional FSCIL methods often require fine-tuning parameters on scarce new-class data and suffer from a separation between learning new classes and utilizing old knowledge. Inspired by the analogical learning mechanisms of the human brain, we propose a novel analogical generative method. Our approach includes the Brain-Inspired Analogical Generator (BiAG), which derives new-class weights from existing classes without parameter fine-tuning during incremental stages. BiAG consists of three components: a Weight Self-Attention Module (WSA), a Weight & Prototype Analogical Attention Module (WPAA), and a Semantic Conversion Module (SCM). SCM applies Neural Collapse theory for semantic conversion, WSA supplements new-class weights, and WPAA computes analogies to generate new-class weights. Experiments on the miniImageNet, CUB-200, and CIFAR-100 datasets demonstrate that our method achieves higher final and average accuracy than SOTA methods.
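To make the pipeline concrete, below is a minimal PyTorch sketch of attention-based analogical weight generation in the spirit of BiAG. It is not the authors' implementation: the class name `AnalogicalWeightGenerator`, the dimensions, and the choice of standard multi-head attention for WSA/WPAA and a linear projection for SCM are all assumptions drawn only from the abstract.

```python
# Minimal sketch of an analogical weight generator in the spirit of BiAG.
# All module structure, names, and dimensions here are assumptions based
# on the abstract, not the authors' released code.
import torch
import torch.nn as nn


class AnalogicalWeightGenerator(nn.Module):
    """Generates classifier weights for new classes from old-class weights
    and new-class prototypes, without fine-tuning any existing parameters."""

    def __init__(self, dim: int, n_heads: int = 4):
        super().__init__()
        # SCM (assumed): projects new-class prototypes into the same
        # semantic space as the classifier weights.
        self.scm = nn.Linear(dim, dim)
        # WSA (assumed): self-attention over the pool of existing weights,
        # letting old-class knowledge contextualize the generation.
        self.wsa = nn.MultiheadAttention(dim, n_heads, batch_first=True)
        # WPAA (assumed): cross-attention in which new-class prototypes
        # query old-class weights to produce analogical new weights.
        self.wpaa = nn.MultiheadAttention(dim, n_heads, batch_first=True)

    def forward(self, old_weights: torch.Tensor,
                new_protos: torch.Tensor) -> torch.Tensor:
        # old_weights: (1, n_old, dim); new_protos: (1, n_new, dim)
        ctx, _ = self.wsa(old_weights, old_weights, old_weights)  # contextualize old weights
        q = self.scm(new_protos)                                  # semantic conversion
        new_weights, _ = self.wpaa(q, ctx, ctx)                   # analogy over old knowledge
        return new_weights                                        # (1, n_new, dim)


# Usage: append the generated weights to the classifier; no gradient step
# is needed in the incremental session.
gen = AnalogicalWeightGenerator(dim=512)
old_w = torch.randn(1, 60, 512)   # e.g. weights of 60 base-session classes
protos = torch.randn(1, 5, 512)   # 5-way few-shot class prototypes
with torch.no_grad():
    w_new = gen(old_w, protos)    # (1, 5, 512) new classifier weights
```

The design point the abstract emphasizes is that new-class weights are obtained purely by querying old knowledge, so no gradient update touches the backbone or the existing classifier during an incremental stage.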

Jizhou Han, Chenhao Ding, Yuhang He, Songlin Dong, Qiang Wang, Xinyuan Gao, Yihong Gong • 2025

Related benchmarks

| Task | Dataset | Result | Rank |
| --- | --- | --- | --- |
| Few-Shot Class-Incremental Learning | CIFAR100 (test) | Session 4 Top-1 Acc: 67.36 | 122 |
| Few-Shot Class-Incremental Learning | CIFAR100 | -- | 67 |
| Few-Shot Class-Incremental Learning | MiniImagenet | Avg Accuracy: 86.43 | 31 |
| Few-Shot Class-Incremental Learning | miniImageNet 5-way 5-shot (incremental) | Accuracy S0: 84.78 | 21 |
| Few-Shot Class-Incremental Learning | CUB-200 | Session 0 Accuracy: 82.97 | 21 |
| Few-Shot Class-Incremental Learning | miniImageNet, CUB-200, and CIFAR-100 Average (test) | Base Accuracy: 83.92 | 18 |
| Few-Shot Class-Incremental Learning | ImageNet 1k (test) | Base Accuracy: 37.98 | 3 |
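For reference, the summary metrics in the table follow the usual FSCIL protocol: top-1 accuracy is measured after each incremental session, the last session gives the "final" accuracy, and the mean over sessions gives the "Avg Accuracy". A minimal sketch; the session values below are illustrative placeholders, not results reported for BiAG:

```python
# Hypothetical per-session top-1 accuracies, S0 through S4 (placeholders,
# not values from this paper).
session_acc = [80.0, 76.0, 72.5, 70.1, 68.4]

avg_accuracy = sum(session_acc) / len(session_acc)  # "Avg Accuracy" over sessions
final_accuracy = session_acc[-1]                    # last-session ("final") accuracy

print(f"Avg: {avg_accuracy:.2f}  Final: {final_accuracy:.2f}")
```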
