
DyTox: Transformers for Continual Learning with DYnamic TOken eXpansion

About

Deep network architectures struggle to continually learn new tasks without forgetting the previous tasks. A recent trend indicates that dynamic architectures based on an expansion of the parameters can reduce catastrophic forgetting efficiently in continual learning. However, existing approaches often require a task identifier at test time, need complex tuning to balance the growing number of parameters, and barely share any information across tasks. As a result, they struggle to scale to a large number of tasks without significant overhead. In this paper, we propose a transformer architecture based on a dedicated encoder/decoder framework. Critically, the encoder and decoder are shared among all tasks. Through a dynamic expansion of special tokens, we specialize each forward pass of our decoder network on a task distribution. Our strategy scales to a large number of tasks while having negligible memory and time overheads due to strict control of the parameter expansion. Moreover, this efficient strategy does not need any hyperparameter tuning to control the network's expansion. Our model reaches excellent results on CIFAR100 and state-of-the-art performance on the large-scale ImageNet100 and ImageNet1000, while having fewer parameters than concurrent dynamic frameworks.
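The core idea from the abstract can be sketched in a few lines: the encoder and decoder weights are shared across all tasks, and each new task adds only one learned "task token" (plus a small classifier head) that specializes the decoder's forward pass. The following is a minimal NumPy sketch of that mechanism; all names, dimensions, and the single-layer attention decoder are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

class DyToxSketch:
    """Illustrative sketch of DyTox-style dynamic token expansion.

    One shared decoder (here reduced to a single attention layer) is
    specialized per task by a learned task token. Only the token and a
    small head are added per task, so growth stays negligible.
    """
    def __init__(self, dim=16, seed=0):
        self.dim = dim
        self.rng = np.random.default_rng(seed)
        # Shared decoder projections, reused by every task.
        self.Wq = self.rng.standard_normal((dim, dim)) / np.sqrt(dim)
        self.Wk = self.rng.standard_normal((dim, dim)) / np.sqrt(dim)
        self.Wv = self.rng.standard_normal((dim, dim)) / np.sqrt(dim)
        self.task_tokens = []  # grows by one dim-sized vector per task
        self.heads = []        # one small classifier head per task

    def add_task(self, n_classes):
        # Dynamic expansion: one token + one head, nothing else.
        self.task_tokens.append(self.rng.standard_normal(self.dim))
        self.heads.append(self.rng.standard_normal((self.dim, n_classes)))

    def forward(self, patch_tokens):
        """patch_tokens: (n_patches, dim) output of the shared encoder."""
        logits = []
        for token, head in zip(self.task_tokens, self.heads):
            # The task token attends over the patch tokens: one decoder
            # pass per task, with weights shared across all tasks.
            q = token @ self.Wq                        # (dim,)
            k = patch_tokens @ self.Wk                 # (n_patches, dim)
            v = patch_tokens @ self.Wv                 # (n_patches, dim)
            attn = softmax(q @ k.T / np.sqrt(self.dim))
            task_embed = attn @ v                      # (dim,)
            logits.append(task_embed @ head)           # (n_classes,)
        # Concatenated logits give a class-incremental prediction
        # without needing a task identifier at test time.
        return np.concatenate(logits)

model = DyToxSketch()
model.add_task(n_classes=10)  # task 1
model.add_task(n_classes=10)  # task 2 adds 1 token + 1 head only
x = np.random.default_rng(1).standard_normal((49, 16))  # fake patches
print(model.forward(x).shape)  # (20,)
```

Note how the per-task cost is a single `dim`-sized vector plus a head, which is why this style of expansion stays cheap even over many tasks; everything else in the forward pass is shared.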

Arthur Douillard, Alexandre Ramé, Guillaume Couairon, Matthieu Cord • 2021

Related benchmarks

| Task | Dataset | Result | Rank |
| --- | --- | --- | --- |
| Image Classification | DomainNet (test) | - | 209 |
| Class-incremental learning | CIFAR100 (test) | Avg Acc: 70.28 | 76 |
| Class-incremental learning | CIFAR-100 10 (test) | Average Top-1 Accuracy: 77.1 | 75 |
| Class-incremental learning | ImageNet-100 | Avg Acc: 77.08 | 74 |
| Class-incremental learning | ImageNet-100 B=50, C=10 1.0 | Avg Incremental Acc: 79.8 | 42 |
| Incremental Learning | CIFAR100 10 steps | Final Step Performance: 62.34 | 39 |
| Incremental Learning | CIFAR100 50 steps | Last Accuracy: 57.09 | 36 |
| Domain-incremental learning | CORe50 (test) | Test Accuracy: 79.21 | 34 |
| Class-incremental learning | CIFAR100 B0 (20 steps) (test) | Last Step Top-1 Acc: 56.32 | 31 |
| Domain-incremental learning | CDDB Hard (test) | Average Accuracy: 86.21 | 25 |

Showing 10 of 41 rows.

Other info

Code
