Overcoming Catastrophic Forgetting with Unlabeled Data in the Wild

About

Lifelong learning with deep neural networks is well-known to suffer from catastrophic forgetting: the performance on previous tasks drastically degrades when learning a new task. To alleviate this effect, we propose to leverage a large stream of unlabeled data easily obtainable in the wild. In particular, we design a novel class-incremental learning scheme with (a) a new distillation loss, termed global distillation, (b) a learning strategy to avoid overfitting to the most recent task, and (c) a confidence-based sampling method to effectively leverage unlabeled external data. Our experimental results on various datasets, including CIFAR and ImageNet, demonstrate the superiority of the proposed methods over prior methods, particularly when a stream of unlabeled data is accessible: our method shows up to 15.8% higher accuracy and 46.5% less forgetting compared to the state-of-the-art method. The code is available at https://github.com/kibok90/iccv2019-inc.
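To make the two distillation-related ideas concrete, here is a minimal PyTorch sketch, not the authors' implementation (see the linked repository for that). The function names, the `temperature` parameter, and the top-k confidence rule below are illustrative assumptions.

```python
import torch
import torch.nn.functional as F

def global_distillation_loss(new_logits, old_logits, temperature=2.0):
    """Distill the previous model's knowledge over ALL previously seen
    classes at once, rather than per task ('local' distillation).
    new_logits: student outputs over old + new classes.
    old_logits: teacher outputs over the old classes only."""
    n_old = old_logits.size(1)  # number of previously seen classes
    log_p_new = F.log_softmax(new_logits[:, :n_old] / temperature, dim=1)
    p_old = F.softmax(old_logits / temperature, dim=1)
    # Scale by T^2 so gradient magnitude stays comparable across temperatures.
    return F.kl_div(log_p_new, p_old, reduction="batchmean") * temperature ** 2

def confidence_based_sample(unlabeled_logits, k):
    """Keep the k unlabeled examples the previous model is most confident
    about; confident samples carry the most usable old-task knowledge.
    (Illustrative selection rule, assumed for this sketch.)"""
    confidence = F.softmax(unlabeled_logits, dim=1).max(dim=1).values
    return confidence.topk(k).indices
```

The key distinction from task-wise distillation is that the teacher's softmax here is taken over all previous classes jointly, so relationships between classes from different earlier tasks are preserved, and the confidence filter decides which unlabeled wild data is worth distilling from.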

Kibok Lee, Kimin Lee, Jinwoo Shin, Honglak Lee • 2019

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
| --- | --- | --- | --- | --- |
| Image Classification | CIFAR-100 Split | Accuracy | 62.1 | 61 |
| Continual Learning | CIFAR-100 (10-split) | ACC | 71.27 | 42 |
| Continual Learning | TinyImageNet 25-split | ACC | 42.74 | 29 |
| Continual Learning | Split CIFAR-100 20 tasks | Mean Test Accuracy | 77.16 | 26 |
| Audio-Visual Sound Separation | MUSIC-21 (test) | SDR | 6.65 | 24 |
