
Efficient Lifelong Learning with A-GEM

About

In lifelong learning, the learner is presented with a sequence of tasks, incrementally building a data-driven prior which may be leveraged to speed up learning of a new task. In this work, we investigate the efficiency of current lifelong approaches, in terms of sample complexity, computational and memory cost. Towards this end, we first introduce a new and more realistic evaluation protocol, whereby learners observe each example only once and hyper-parameter selection is done on a small and disjoint set of tasks, which is not used for the actual learning experience and evaluation. Second, we introduce a new metric measuring how quickly a learner acquires a new skill. Third, we propose an improved version of GEM (Lopez-Paz & Ranzato, 2017), dubbed Averaged GEM (A-GEM), which enjoys the same or even better performance than GEM, while being almost as computationally and memory efficient as EWC (Kirkpatrick et al., 2016) and other regularization-based methods. Finally, we show that all algorithms including A-GEM can learn even more quickly if they are provided with task descriptors specifying the classification tasks under consideration. Our experiments on several standard lifelong learning benchmarks demonstrate that A-GEM has the best trade-off between accuracy and efficiency.

Arslan Chaudhry, Marc'Aurelio Ranzato, Marcus Rohrbach, Mohamed Elhoseiny • 2018
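
The efficiency gain of A-GEM over GEM comes from replacing GEM's quadratic program over per-task gradients with a single projection against one averaged reference gradient computed on a batch drawn from episodic memory. Below is a minimal, illustrative sketch of that projection step in NumPy; the function and variable names are ours and are not taken from the authors' released code.

import numpy as np

def agem_project(grad: np.ndarray, grad_ref: np.ndarray) -> np.ndarray:
    """Project the current-task gradient so it does not conflict with memory.

    `grad`     : flattened gradient on the current task mini-batch.
    `grad_ref` : flattened gradient on a mini-batch sampled from episodic memory.
    If the two gradients conflict (negative dot product), remove the component
    of `grad` along `grad_ref`; otherwise return `grad` unchanged.
    """
    dot = grad @ grad_ref
    if dot < 0:
        return grad - (dot / (grad_ref @ grad_ref)) * grad_ref
    return grad

# Toy usage with a conflicting pair of gradients.
g = np.array([1.0, -2.0])
g_ref = np.array([1.0, 1.0])
g_tilde = agem_project(g, g_ref)
print(g_tilde @ g_ref)  # ~0.0: the projected gradient no longer conflicts with the memory gradient

Because only one reference gradient is needed per update, the extra cost over plain SGD is roughly one additional forward/backward pass on the memory batch, which is what keeps A-GEM close to regularization-based methods such as EWC in compute and memory.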

Related benchmarks

Task                            Dataset           Result                           Rank
Language Understanding          MMLU              Accuracy: 6.46                   825
Reasoning                       BBH               --                               672
Physical Commonsense Reasoning  PIQA              Accuracy: 53.92                  572
Generalized Zero-Shot Learning  CUB               --                               307
Generalized Zero-Shot Learning  SUN               --                               229
Generalized Zero-Shot Learning  AWA2              --                               217
Continual Learning              Sequential MNIST  Avg Acc: 98.93                   149
Text Classification             20News            Accuracy: 93.31                  127
Class-incremental learning      ImageNet-R        --                               112
Continual Learning              CIFAR100 Split    Average Per-Task Accuracy: 62.3  85

Showing 10 of 160 rows

Other info

Code
