
Variational Continual Learning

About

This paper develops variational continual learning (VCL), a simple but general framework for continual learning that fuses online variational inference (VI) and recent advances in Monte Carlo VI for neural networks. The framework can successfully train both deep discriminative models and deep generative models in complex continual learning settings where existing tasks evolve over time and entirely new tasks emerge. Experimental results show that VCL outperforms state-of-the-art continual learning methods on a variety of tasks, avoiding catastrophic forgetting in a fully automatic way.
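The core recursion behind VCL is that the approximate posterior learned after task t becomes the prior for task t+1, so knowledge accumulates without revisiting old data. The sketch below illustrates this with a toy model (a single Gaussian mean with known noise variance, where the update is exact); the function name and data are illustrative, not from the paper.

```python
# Minimal sketch of the online Bayesian update underlying VCL:
# the posterior after each task serves as the prior for the next.
# Toy model: scalar Gaussian mean with known noise variance, so the
# "variational" update is exact. All names/values here are illustrative.

def vcl_update(prior_mean, prior_var, data, noise_var=1.0):
    """Return the Gaussian posterior over the mean after one task's data."""
    n = len(data)
    post_var = 1.0 / (1.0 / prior_var + n / noise_var)
    post_mean = post_var * (prior_mean / prior_var + sum(data) / noise_var)
    return post_mean, post_var

# Two sequential "tasks": the second update starts from the first posterior,
# so no data from task 1 needs to be stored or replayed.
mean, var = 0.0, 10.0                                # broad initial prior
mean, var = vcl_update(mean, var, [2.1, 1.9, 2.0])   # task 1
mean, var = vcl_update(mean, var, [2.2, 1.8])        # task 2
```

In the paper this recursion is applied to neural-network weights, where the update is no longer exact and is approximated with Monte Carlo variational inference.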

Cuong V. Nguyen, Yingzhen Li, Thang D. Bui, Richard E. Turner • 2017

Related benchmarks

Task                               Dataset                                    Metric                     Result   Rank
Image Classification               Split MNIST                                Average Accuracy           96.45    49
Continual Learning                 Permuted MNIST                             Mean Test Accuracy         95.5     44
Continual Learning                 Split MNIST                                Mean Test Accuracy         98.4     19
Continual Learning                 PMNIST (test)                              Accuracy                   90       17
Continual Learning                 Sequential Omniglot (S-OMNIGLOT) (test)    Accuracy                   53.86    12
Continual Learning                 CW10 (sequence)                            Performance                55       11
Continual Reinforcement Learning   CW20 sequence                              Performance                50       11
Image Classification               SplitCIFAR-10 (test)                       Accuracy (TInfer Final)    0.1597   7
