Fly-CL: A Fly-Inspired Framework for Enhancing Efficient Decorrelation and Reduced Training Time in Pre-trained Model-based Continual Representation Learning

About

Using a nearly-frozen pretrained model, the continual representation learning paradigm reframes parameter updates as a similarity-matching problem to mitigate catastrophic forgetting. However, directly leveraging pretrained features for downstream tasks often suffers from multicollinearity in the similarity-matching stage, and more advanced methods can be computationally prohibitive for real-time, low-latency applications. Inspired by the fly olfactory circuit, we propose Fly-CL, a bio-inspired framework compatible with a wide range of pretrained backbones. Fly-CL substantially reduces training time while achieving performance comparable to or exceeding that of current state-of-the-art methods. We theoretically show how Fly-CL progressively resolves multicollinearity, enabling more effective similarity matching with low time complexity. Extensive simulation experiments across diverse network architectures and data regimes validate Fly-CL's effectiveness in addressing this challenge through a biologically inspired design. Code is available at https://github.com/gfyddha/Fly-CL.
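The paper's exact pipeline is in the linked repository; as context, the fly olfactory circuit it draws on is commonly modeled as a sparse random expansion followed by a winner-take-all step, which yields sparse, decorrelated codes at low cost. The sketch below illustrates that general mechanism only (all names and sizes are illustrative assumptions, not Fly-CL's implementation):

```python
import numpy as np

def fly_expand(x, proj, k):
    """Expand a feature vector into a high-dimensional sparse code, fly-style.

    x:    (d,) feature vector (e.g., from a frozen pretrained backbone)
    proj: (m, d) sparse binary random projection with m >> d, loosely
          analogous to the sparse PN->KC connectivity in the fly circuit
    k:    number of winners kept by the winner-take-all step
    """
    h = proj @ x                      # expand to m dimensions
    code = np.zeros_like(h)
    top = np.argsort(h)[-k:]          # winner-take-all: keep the top-k responses
    code[top] = h[top]                # all other units are silenced
    return code

rng = np.random.default_rng(0)
d, m, k = 64, 2000, 40                # illustrative sizes, not from the paper
# each expansion unit samples a small random subset of inputs (sparse weights)
proj = (rng.random((m, d)) < 0.1).astype(float)
x = rng.standard_normal(d)
sparse_code = fly_expand(x, proj, k)  # length-2000 code with 40 active units
```

Because each output unit mixes a different random subset of input features and only the strongest responses survive, the resulting codes overlap far less than the raw features, which is the decorrelation property the similarity-matching stage benefits from.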

Heming Zou, Yunliang Zang, Wutong Xu, Xiangyang Ji • 2025

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Continual Learning | CIFAR-100 | – | – | 56 |
| Image Classification | ImageNet-A | – | – | 50 |
| Continual Learning | CIFAR-100 | Training Time per Task | 8.31 | 21 |
| Continual Learning | CUB-200-2011 | Avg Training Time per Task | 2.76 | 21 |
| Continual Learning | VTAB | Average Training Time per Task | 1.7 | 21 |
| Continual Learning | CUB | Backward Transfer (BwT) | -3.8 | 17 |
| Continual Learning | VTAB | Average Task Accuracy (AT) | 94.61 | 9 |
| Continual Learning | CIFAR-100 | Memory Usage (GB) | 6.7 | 9 |
| Continual Learning | VTAB | Memory Usage (GB) | 4.3 | 9 |
| Image Classification | ImageNet-R | Tau (Training) | 7.55 | 9 |
