Task Agnostic Continual Learning Using Online Variational Bayes

About

Catastrophic forgetting is the notorious vulnerability of neural networks to changes in the data distribution during learning. This phenomenon has long been considered a major obstacle to deploying learning agents in realistic continual learning settings. A large body of continual learning research assumes that task boundaries are known during training; research on scenarios in which task boundaries are unknown during training has been lacking. In this paper we present, for the first time, Bayesian Gradient Descent (BGD), a method for preventing catastrophic forgetting when task boundaries are unknown during training --- task-agnostic continual learning. The code of our algorithm is available at https://github.com/igolan/bgd.
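
For intuition about how the method works: BGD keeps a diagonal Gaussian posterior N(μ, diag(σ²)) over the network weights and, after each mini-batch, updates μ and σ in closed form using Monte Carlo estimates of the loss gradient under the reparameterization θ = μ + σ∘ε, ε ~ N(0, I). The sketch below is a minimal NumPy rendering of those closed-form updates, not the authors' code; the names (bgd_update, grad_fn) and the defaults for the learning rate eta and the number of Monte Carlo samples n_samples are illustrative assumptions, and the reference implementation lives in the repository linked above.

```python
import numpy as np

def bgd_update(mu, sigma, grad_fn, eta=1.0, n_samples=10):
    """One Bayesian Gradient Descent (BGD) step, sketched from the paper's
    closed-form updates for a diagonal Gaussian posterior
    N(mu, diag(sigma**2)) over the weights.

    grad_fn(theta) should return dL/dtheta for a sampled weight vector
    theta (e.g. via backprop on the current mini-batch).
    NOTE: names and defaults here are illustrative, not from the repo.
    """
    e_g = np.zeros_like(mu)    # Monte Carlo estimate of E[grad]
    e_ge = np.zeros_like(mu)   # Monte Carlo estimate of E[grad * eps]
    for _ in range(n_samples):
        eps = np.random.randn(*mu.shape)  # eps ~ N(0, I)
        theta = mu + sigma * eps          # reparameterization trick
        g = grad_fn(theta)
        e_g += g / n_samples
        e_ge += g * eps / n_samples

    # Closed-form online variational Bayes updates: the mean moves along
    # the expected gradient with a per-parameter step size of sigma**2,
    # while sigma shrinks or grows depending on how the gradient
    # correlates with the sampling noise.
    new_mu = mu - eta * sigma**2 * e_g
    new_sigma = (sigma * np.sqrt(1.0 + (0.5 * sigma * e_ge)**2)
                 - 0.5 * sigma**2 * e_ge)  # stays positive for any e_ge
    return new_mu, new_sigma

# Toy usage: one step on L(theta) = 0.5 * ||theta - t||^2 (gradient theta - t).
t = np.array([1.0, -2.0, 3.0])
mu, sigma = np.zeros_like(t), np.full_like(t, 0.1)
mu, sigma = bgd_update(mu, sigma, grad_fn=lambda th: th - t)
```

Because each parameter's effective step size is σ², weights whose posterior uncertainty has collapsed are barely moved by new data; this is what mitigates forgetting without any knowledge of task boundaries.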

Chen Zeno, Itay Golan, Elad Hoffer, Daniel Soudry • 2018

Related benchmarks

Task                        Dataset                      Result           Rank
Online Continual Learning   CIFAR-100 1 (test)           Accuracy: 1.10   20
Online Continual Learning   MNIST 10/1 (test)            Accuracy: 10.9   20
Online Continual Learning   CIFAR-10 10/1 (test)         Accuracy: 10     20
Online Continual Learning   miniImageNet 100/1 (test)    Accuracy: 1      19
