
Gradient Episodic Memory for Continual Learning

About

One major obstacle towards AI is the poor ability of models to solve new problems more quickly, and without forgetting previously acquired knowledge. To better understand this issue, we study the problem of continual learning, where the model observes, once and one by one, examples concerning a sequence of tasks. First, we propose a set of metrics to evaluate models learning over a continuum of data. These metrics characterize models not only by their test accuracy, but also in terms of their ability to transfer knowledge across tasks. Second, we propose a model for continual learning, called Gradient Episodic Memory (GEM), which alleviates forgetting while allowing beneficial transfer of knowledge to previous tasks. Our experiments on variants of the MNIST and CIFAR-100 datasets demonstrate the strong performance of GEM when compared to the state-of-the-art.

David Lopez-Paz, Marc'Aurelio Ranzato • 2017
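
To make the transfer metrics concrete: the paper evaluates a T-task run with a matrix R, where R[i][j] is the test accuracy on task j after training on task i, and derives average accuracy (ACC), backward transfer (BWT, negative values indicate forgetting), and forward transfer (FWT, measured against a random-initialization baseline b). Below is a minimal sketch of that computation; the function and variable names are illustrative, not from the paper's code.

import numpy as np

def continual_metrics(R, b):
    """ACC, BWT, and FWT from a task-accuracy matrix, per the paper.

    R : (T, T) array; R[i, j] = test accuracy on task j after the model
        has finished training on task i (tasks indexed from 0).
    b : (T,) array; b[j] = test accuracy on task j of a model with
        random initialization (forward-transfer baseline).
    """
    T = R.shape[0]
    acc = R[-1].mean()                           # mean accuracy after the final task
    bwt = (R[-1, :-1] - np.diag(R)[:-1]).mean()  # < 0 means forgetting
    fwt = (R[np.arange(T - 1), np.arange(1, T)] - b[1:]).mean()
    return acc, bwt, fwt

For example, with T = 2, R = [[0.9, 0.1], [0.85, 0.8]], and baseline b = [0.1, 0.1], this yields ACC = 0.825, BWT = -0.05 (slight forgetting of the first task), and FWT = 0.0.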
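
As for GEM itself: as described in the paper, it keeps a small episodic memory of examples for each past task and constrains every parameter update, requiring the proposed gradient g to satisfy <g, g_k> >= 0 for the gradient g_k computed on each past task's memory; a violating g is projected to the closest gradient satisfying all constraints by solving a small quadratic program in the dual. The sketch below illustrates only the closed-form single-constraint projection (the joint QP is omitted), and all names are illustrative.

import numpy as np

def gem_project(g, memory_grads):
    """Simplified sketch of GEM's gradient projection.

    g            : flattened gradient on the current task's minibatch.
    memory_grads : list of flattened gradients, one per past task,
                   computed on that task's episodic memory.

    GEM requires <g, g_k> >= 0 for every past-task gradient g_k (the
    update must not increase any memory's loss). The paper enforces all
    constraints jointly via a small dual quadratic program; here each
    violated constraint is handled in isolation, which is exact only in
    the single-constraint case.
    """
    g = g.astype(float).copy()
    for g_k in memory_grads:
        dot = g @ g_k
        if dot < 0:  # update would increase loss on task k's memory
            g -= (dot / (g_k @ g_k)) * g_k  # project onto <g, g_k> >= 0
    return g

When no constraint is violated, the gradient is applied unchanged; and since updates with <g, g_k> > 0 can also lower the loss on stored memories, this is what permits beneficial transfer to previous tasks rather than merely freezing them.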

Related benchmarks

Task                         Dataset           Result                            Rank
Image Classification         CIFAR-100         --                                 622
Image Classification         CIFAR-10          --                                 471
Continual Learning           Sequential MNIST  Avg Acc: 99.44                     149
Node Classification          Reddit (test)     --                                 134
Node Classification          ACM               --                                 104
Continual Learning           CIFAR100 Split    Average Per-Task Accuracy: 88.95    85
Image Classification         CIFAR-100 Split   Accuracy: 61.9                      61
Class-incremental learning   CIFAR10 (test)    Average Accuracy: 37.51             59
Node Classification          DBLP              F1-AP: 80.04                        45
Node Classification          Kindle            F1 (AP): 76.46                      45
Showing 10 of 110 rows.
