
Don't Look Back in Anger: MAGIC Net for Streaming Continual Learning with Temporal Dependence

About

Concept drift, temporal dependence, and catastrophic forgetting represent major challenges when learning from data streams. While Streaming Machine Learning and Continual Learning (CL) address these issues separately, recent efforts in Streaming Continual Learning (SCL) aim to unify them. In this work, we introduce MAGIC Net, a novel SCL approach that integrates CL-inspired architectural strategies with recurrent neural networks to tame temporal dependence. MAGIC Net continuously learns, looks back at past knowledge by applying learnable masks over frozen weights, and expands its architecture when necessary. It performs all operations online, ensuring inference availability at all times. Experiments on synthetic and real-world streams show that it improves adaptation to new concepts, limits memory usage, and mitigates forgetting.
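To make the "learnable masks over frozen weights" idea concrete, here is a minimal, hypothetical sketch: a frozen weight matrix is gated elementwise by a sigmoid mask, and only the mask parameters receive gradient updates. The shapes, the soft sigmoid gating, and the squared-error objective are illustrative assumptions, not the authors' actual MAGIC Net implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

W_frozen = rng.standard_normal((4, 3))  # weights frozen after a past concept
W0 = W_frozen.copy()                    # kept only to verify W_frozen never changes
mask_logits = np.zeros((4, 3))          # the only learnable parameters here

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(x):
    # Frozen weights gated elementwise by a learnable soft mask in (0, 1).
    return x @ (W_frozen * sigmoid(mask_logits))

x = rng.standard_normal((2, 4))
y_target = rng.standard_normal((2, 3))

def loss():
    return 0.5 * np.sum((forward(x) - y_target) ** 2)

init_loss = loss()

# Online-style gradient steps on the mask ONLY: adapting to the current
# concept cannot overwrite W_frozen, which is how forgetting is limited.
lr = 0.1
for _ in range(100):
    m = sigmoid(mask_logits)
    err = x @ (W_frozen * m) - y_target            # dLoss/dy for 0.5 * sum SE
    grad_W_eff = x.T @ err                         # gradient w.r.t. gated weights
    grad_logits = grad_W_eff * W_frozen * m * (1 - m)  # chain rule through mask
    mask_logits -= lr * grad_logits

final_loss = loss()
```

After training, `final_loss` is below `init_loss` while `W_frozen` is bit-for-bit unchanged: the mask alone adapts the network's behavior to the current concept.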

Federico Giannini, Sandro D'Andrea, Emanuele Della Valle • 2026

Related benchmarks

Task                       | Dataset          | Result                      | Rank
Continual Learning         | AirQuality       | Average Accuracy: 44        | 15
Continual Learning         | PowerConsumption | Average Accuracy: 61        | 15
Continual Learning         | Weather          | Average Accuracy: 53        | 15
Prequential classification | AirQuality       | Cohen's Kappa (Start): 0.43 | 15
Prequential classification | PowerConsumption | Cohen's Kappa (Start): 0.68 | 15
Prequential classification | Weather          | Cohen's Kappa (Start): 0.56 | 15
Continual Learning         | SRW              | Average Performance: 72     | 15
Prequential classification | SRW              | Cohen's Kappa (Start): 0.71 | 15
