
Continual Learning Through Synaptic Intelligence

About

While deep learning has led to remarkable advances across diverse applications, it struggles in domains where the data distribution changes over the course of learning. In stark contrast, biological neural networks continually adapt to changing domains, possibly by leveraging complex molecular machinery to solve many tasks simultaneously. In this study, we introduce intelligent synapses that bring some of this biological complexity into artificial neural networks. Each synapse accumulates task relevant information over time, and exploits this information to rapidly store new memories without forgetting old ones. We evaluate our approach on continual learning of classification tasks, and show that it dramatically reduces forgetting while maintaining computational efficiency.
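The abstract's "intelligent synapses" accumulate a per-parameter importance signal during training and penalize later changes to important weights. A minimal sketch of that idea follows: each parameter sums a path integral `omega += -grad * delta_theta` over a task, which is consolidated into an importance `Omega` and used as a quadratic penalty on drift. The class name, hyperparameter values, and method names here are illustrative, not from the paper; the structure follows the paper's surrogate-loss formulation.

```python
import numpy as np

class SynapticIntelligence:
    """Sketch of per-synapse importance tracking for continual learning.

    While training on a task, each weight accumulates the running sum
    omega += -grad * delta_theta: large when moving that weight reduced
    the loss. After the task, importances Omega penalize later drift
    away from the consolidated weights theta_star.
    """

    def __init__(self, theta, c=0.1, xi=1e-3):
        self.theta_star = theta.copy()      # weights after the last task
        self.omega = np.zeros_like(theta)   # per-task path integral
        self.Omega = np.zeros_like(theta)   # consolidated importance
        self.c = c                          # regularization strength (assumed value)
        self.xi = xi                        # damping term (assumed value)

    def accumulate(self, grad, delta_theta):
        # Contribution of one update step to the path integral.
        self.omega += -grad * delta_theta

    def consolidate(self, theta):
        # Normalize by the squared displacement over the task, then reset.
        delta = theta - self.theta_star
        self.Omega += self.omega / (delta ** 2 + self.xi)
        self.theta_star = theta.copy()
        self.omega = np.zeros_like(theta)

    def penalty_grad(self, theta):
        # Gradient of the surrogate loss c * sum(Omega * (theta - theta_star)^2),
        # added to the task gradient when training on subsequent tasks.
        return 2.0 * self.c * self.Omega * (theta - self.theta_star)
```

In use, `accumulate` is called after every optimizer step with that step's gradient and parameter change, `consolidate` once at each task boundary, and `penalty_grad` is added to the loss gradient while training later tasks, so weights that mattered for old tasks resist being overwritten.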

Friedemann Zenke, Ben Poole, Surya Ganguli • 2017

Related benchmarks

Task | Dataset | Result | Rank
Recommendation | Gowalla | -- | 153
Continual Learning | Sequential MNIST | Avg Acc: 96 | 149
Class-incremental learning | CIFAR-100 10 (test) | -- | 105
Continual Learning | CIFAR100 Split | Average Per-Task Accuracy: 50.4 | 85
Image Classification | permuted MNIST (pMNIST) (test) | Accuracy: 95.33 | 69
Exemplar-Free Class-Incremental Learning | CIFAR-100 | Avg Top-1 Inc Acc: 37.68 | 68
Exemplar-Free Class-Incremental Learning | TinyImageNet | Top-1 Acc (Inc): 27.02 | 62
Image Classification | CIFAR-100 Split | Accuracy: 74.84 | 61
Class-incremental learning | CIFAR10 (test) | Average Accuracy: 27.43 | 59
Continual Learning | Tiny ImageNet Split | Forgetting Rate: 49.7 | 57

Showing 10 of 159 rows.
