F-OAL: Forward-only Online Analytic Learning with Fast Training and Low Memory Footprint in Class Incremental Learning

About

Online Class Incremental Learning (OCIL) aims to train models incrementally, where data arrive in mini-batches and previous data are not accessible. A major challenge in OCIL is Catastrophic Forgetting, i.e., the loss of previously learned knowledge. Among existing baselines, replay-based methods show competitive results but require extra memory for storing exemplars, while exemplar-free methods (i.e., those that store no data for replay) are resource-friendly but often lack accuracy. In this paper, we propose an exemplar-free approach, Forward-only Online Analytic Learning (F-OAL). Unlike traditional methods, F-OAL does not rely on back-propagation and is forward-only, significantly reducing memory usage and computational time. Paired with a pre-trained frozen encoder with Feature Fusion, F-OAL only needs to update a linear classifier via recursive least squares. This approach simultaneously achieves high accuracy and low resource consumption. Extensive experiments on benchmark datasets demonstrate F-OAL's robust performance in OCIL scenarios. Code is available at https://github.com/liuyuchen-cz/F-OAL.
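The core idea of updating a linear classifier on frozen-encoder features via recursive least squares can be illustrated with a short sketch. This is not the authors' implementation; the class and parameter names (`RLSClassifier`, `ridge`) are illustrative, and Feature Fusion is omitted for brevity. It shows the forward-only property: each mini-batch updates the closed-form solution in place, and no past data or gradients are kept.

```python
import numpy as np

class RLSClassifier:
    """Illustrative recursive least squares (ridge) linear classifier.

    Each mini-batch refines the closed-form ridge solution without
    back-propagation, so previous batches never need to be stored.
    """

    def __init__(self, feature_dim, num_classes, ridge=1.0):
        self.W = np.zeros((feature_dim, num_classes))
        # Inverse of the regularized feature autocorrelation matrix.
        self.P = np.eye(feature_dim) / ridge

    def partial_fit(self, X, Y):
        """One forward-only update.

        X: (batch, feature_dim) features from a frozen encoder
        Y: (batch, num_classes) one-hot labels
        """
        # Woodbury identity: fold the new batch into P = (X'X + ridge*I)^-1.
        K = self.P @ X.T @ np.linalg.inv(np.eye(X.shape[0]) + X @ self.P @ X.T)
        self.P = self.P - K @ X @ self.P
        # Move the weights toward the residual of the new batch.
        self.W = self.W + self.P @ X.T @ (Y - X @ self.W)

    def predict(self, X):
        return np.argmax(X @ self.W, axis=1)
```

After any number of `partial_fit` calls, `W` equals the batch ridge solution over all data seen so far, which is why streaming updates lose no accuracy relative to joint fitting of the classifier head.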

Huiping Zhuang, Yuchen Liu, Run He, Kai Tong, Ziqian Zeng, Cen Chen, Yi Wang, Lap-Pui Chau• 2024

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
| --- | --- | --- | --- | --- |
| Continual Learning | CIFAR-100 | Accuracy | 91.1 | 56 |
| Image Classification | ImageNet A | -- | -- | 50 |
| Continual Learning | CUB-200 2011 | Avg Training Time per Task | 3.65 | 21 |
| Continual Learning | VTAB | Average Training Time per Task | 2.17 | 21 |
| Continual Learning | CIFAR-100 | Training Time per Task | 16.32 | 21 |
| Class-incremental learning | FGVC Aircraft | Accuracy Last | 54 | 21 |
| Continual Learning | DTD | Average Performance (Aavg) | 82.8 | 18 |
| Continual Learning | CUB | Backward Transfer (BwT) | -5.43 | 17 |
| Continual Learning | CORe50 | -- | -- | 14 |
| Continual Learning | Tiny-ImageNet | -- | -- | 14 |

(10 of 15 rows shown)
