Predicting the Susceptibility of Examples to Catastrophic Forgetting

About

Catastrophic forgetting - the tendency of neural networks to forget previously learned data when learning new information - remains a central challenge in continual learning. In this work, we adopt a behavioral approach, observing a connection between learning speed and forgetting: examples learned more quickly are less prone to forgetting. Focusing on replay-based continual learning, we show that the composition of the replay buffer - specifically, whether it contains quickly or slowly learned examples - has a significant effect on forgetting. Motivated by this insight, we introduce Speed-Based Sampling (SBS), a simple yet general strategy that selects replay examples based on their learning speed. SBS integrates easily into existing buffer-based methods and improves performance across a wide range of competitive continual learning benchmarks, advancing state-of-the-art results. Our findings underscore the value of accounting for forgetting dynamics when designing continual learning algorithms.

Guy Hacohen, Tinne Tuytelaars • 2024
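
The abstract describes Speed-Based Sampling only at a high level, so the sketch below is a hypothetical reconstruction, not the authors' implementation. It assumes a common learning-speed proxy (the mean of per-epoch correctness indicators, so examples that become correct early score higher) and exposes a `prefer` switch for whether quickly or slowly learned examples fill the buffer, since the abstract states only that this composition matters. The names `learning_speed` and `fill_buffer` are illustrative.

```python
"""Minimal sketch of speed-based replay sampling (hypothetical reconstruction).

Assumptions (not taken from the paper):
  - "learning speed" is proxied by the mean of per-epoch correctness
    indicators collected while the example's task is being trained;
  - the buffer is filled by ranking candidates on that proxy, with a
    `prefer` switch controlling whether fast- or slow-learned examples
    are kept.
"""
import numpy as np


def learning_speed(correct_per_epoch: np.ndarray) -> np.ndarray:
    """Per-example learning-speed proxy.

    correct_per_epoch: (num_epochs, num_examples) boolean matrix where
    entry (e, i) is True if example i was classified correctly at the
    end of epoch e. Examples that become correct early accumulate more
    True entries, so a higher mean means "learned faster".
    """
    return correct_per_epoch.mean(axis=0)


def fill_buffer(indices: np.ndarray,
                speeds: np.ndarray,
                buffer_size: int,
                prefer: str = "fast") -> np.ndarray:
    """Select which examples of the current task enter the replay buffer.

    prefer="fast" keeps the most quickly learned examples,
    prefer="slow" the most slowly learned ones.
    """
    order = np.argsort(speeds)          # ascending: slow -> fast
    if prefer == "fast":
        order = order[::-1]             # descending: fast -> slow
    return indices[order[:buffer_size]]


# Toy usage: 5 epochs, 8 examples, buffer of 3.
rng = np.random.default_rng(0)
# Later columns are correct more often, i.e. "learn faster".
correct = rng.random((5, 8)) < np.linspace(0.2, 0.9, 8)
speeds = learning_speed(correct)
buffer = fill_buffer(np.arange(8), speeds, buffer_size=3, prefer="fast")
print("speeds:", np.round(speeds, 2), "buffer:", buffer)
```

In an actual replay method, a selection step like `fill_buffer` would replace the usual (often random) choice of which examples to store at the end of each task, which is presumably why the abstract describes SBS as easy to integrate into existing buffer-based methods.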

Related benchmarks

Task                                 Dataset                 Result                      Rank
Class-incremental learning           CIFAR-100               Average Accuracy: 42.1      116
Class-incremental learning           CIFAR-100 (test)        --                          116
Online Class-Incremental Learning    Tiny-ImageNet           Average Accuracy: 27.3      60
Online Class-Incremental Learning    CIFAR-100               Average Accuracy: 42.1      30
Online Class-Incremental Learning    CIFAR-10                Average Accuracy: 64.8      30
Online Class-Incremental Learning    CIFAR-10 (test)         Average Forgetting: 27.7    30
Online Class-Incremental Learning    Tiny ImageNet (test)    Average Forgetting: 26.8    30
