
Class Incremental Learning with Task-Specific Batch Normalization and Out-of-Distribution Detection

About

This study focuses on incremental learning for image classification, exploring how to reduce catastrophic forgetting of previously learned knowledge when access to old data is restricted. The challenge lies in balancing plasticity (learning new knowledge) and stability (retaining old knowledge). Based on whether the task identifier (task-ID) is available during testing, incremental learning is divided into task incremental learning (TIL) and class incremental learning (CIL). The TIL paradigm often uses multiple classifier heads, selecting the corresponding head based on the task-ID. Since the CIL paradigm cannot access the task-ID, methods originally developed for TIL require explicit task-ID prediction to be adapted to the CIL setting. In this study, a novel continual learning framework extends the TIL approach to CIL by introducing out-of-distribution detection for task-ID prediction. The framework uses task-specific Batch Normalization (BN) and task-specific classification heads to adjust the feature-map distribution for each task, enhancing plasticity. Because BN has far fewer parameters than convolutional kernels, task-specific BN keeps parameter growth small, preserving stability. Each task-specific classification head is given an additional "unknown" class; during training, data from other tasks are mapped to this unknown class, and during inference the task-ID is predicted by selecting the classification head that assigns the lowest probability to its unknown class. The method achieves state-of-the-art performance on two medical image datasets and two natural image datasets. The source code is available at https://github.com/z1968357787/mbn_ood_git_main.
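The unknown-class inference rule described above can be sketched in a few lines of plain Python. This is a minimal illustration, not the authors' implementation: it assumes each task head outputs logits whose last entry is the "unknown" class, and that the head names (`softmax`, `predict`) and the toy logit values are hypothetical.

```python
import math

def softmax(logits):
    # Numerically stable softmax over a list of raw scores.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def predict(head_logits):
    """Pick the task whose head assigns the LOWEST probability to the
    'unknown' class (assumed to be the last logit of each head), then
    classify within that head's known classes.

    head_logits: one logit list per task head, the final entry of each
    being the 'unknown' class score.
    Returns (task_id, class_id_within_task).
    """
    unknown_probs = [softmax(h)[-1] for h in head_logits]
    task_id = min(range(len(head_logits)), key=lambda t: unknown_probs[t])
    known = softmax(head_logits[task_id])[:-1]
    class_id = max(range(len(known)), key=lambda c: known[c])
    return task_id, class_id

# Toy example: two heads, each with 2 known classes + 1 unknown class.
heads = [
    [0.2, 0.1, 3.0],   # head 0: input looks out-of-distribution here
    [2.5, 0.3, -1.0],  # head 1: low unknown score, input belongs here
]
task_id, class_id = predict(heads)
print(task_id, class_id)  # -> 1 0
```

In the full method, each candidate head would also be paired with its own task-specific BN statistics when producing these logits; the sketch only covers the head-selection step.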

Zhiping Zhou, Xuchen Xie, Yiqiao Qiu, Run Lin, Weishi Zheng, Ruixuan Wang• 2024

Related benchmarks

Task                        Dataset                                 Metric              Result   Rank
Class-incremental learning  CIFAR-100 10T                           Avg Accuracy (A_T)  80.34    35
Class-incremental learning  CIFAR-100 T=20 (test)                   Final Accuracy      69.81    25
Incremental Learning        CIFAR100 T=50                           Last Accuracy       68.15    19
Class-incremental learning  CUB200 (10T)                            Last Accuracy       42.33    15
Class-incremental learning  CUB200 (20T)                            Last Accuracy       37.21    15
Class-incremental learning  Path16 Order I 1.0 (train test)         Last Accuracy       73.25    15
Class-incremental learning  Path16 Order II 1.0 (train test)        Last Accuracy       72.25    15
Class-incremental learning  Skin8 Memory size 40 1.0 (train test)   Last Accuracy       49.93    15
Class-incremental learning  Skin8 Memory size 16 1.0 (train test)   Last Accuracy       44.81    15
Incremental Learning        CIFAR100 T=10                           Last Accuracy       69.59    4

(Showing 10 of 12 rows)
