
Continual Learning of a Mixed Sequence of Similar and Dissimilar Tasks

About

Existing research on continual learning of a sequence of tasks has focused on dealing with catastrophic forgetting, where the tasks are assumed to be dissimilar and to share little knowledge. Some work has also been done on transferring previously learned knowledge to a new task when the tasks are similar and do share knowledge. To the best of our knowledge, no technique has been proposed for learning a sequence of mixed similar and dissimilar tasks that can both deal with forgetting and transfer knowledge forward and backward. This paper proposes such a technique for learning both types of tasks in the same network. For dissimilar tasks, the algorithm focuses on dealing with forgetting; for similar tasks, it focuses on selectively transferring the knowledge learned from similar previous tasks to improve learning of the new task. Additionally, the algorithm automatically detects whether a new task is similar to any previous task. Empirical evaluation using sequences of mixed tasks demonstrates the effectiveness of the proposed model.
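The control flow the abstract describes can be sketched in a few lines: for each new task, first test whether it is similar to any previously learned task; if so, selectively transfer knowledge from those tasks, otherwise train while protecting old knowledge against forgetting. The sketch below is purely illustrative — the similarity test (feature overlap) and the `"transfer"`/`"protect"` branches are toy stand-ins, not the paper's actual mechanism.

```python
def detect_similarity(task_a, task_b):
    """Toy similarity score: Jaccard overlap of feature names.
    (A placeholder for the paper's automatic similarity detection.)"""
    fa, fb = set(task_a["features"]), set(task_b["features"])
    return len(fa & fb) / max(len(fa | fb), 1)

def learn_mixed_sequence(tasks, threshold=0.5):
    """Process tasks in order, logging which strategy each one gets."""
    learned, log = [], []
    for task in tasks:
        similar = [p for p in learned if detect_similarity(p, task) >= threshold]
        if similar:
            # Similar task: focus on selectively transferring knowledge
            # from the similar previous tasks to help the new one.
            log.append((task["name"], "transfer", [p["name"] for p in similar]))
        else:
            # Dissimilar task: focus on avoiding catastrophic forgetting,
            # e.g. by keeping its learning isolated from old knowledge.
            log.append((task["name"], "protect", []))
        learned.append(task)
    return log

tasks = [
    {"name": "sentiment-books",  "features": ["polarity", "negation", "tone"]},
    {"name": "image-digits",     "features": ["edges", "strokes"]},
    {"name": "sentiment-movies", "features": ["polarity", "tone", "sarcasm"]},
]
print(learn_mixed_sequence(tasks))
```

Here the third task shares enough features with the first to trigger the transfer branch, while the dissimilar image task is handled by the forgetting-avoidance branch — mirroring the mixed-sequence behavior the abstract claims.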

Zixuan Ke, Bing Liu, Xingchang Huang • 2021

Related benchmarks

Task                               Dataset                           Result            Rank
Text Classification                20News                            Accuracy 95.17    101
Document Sentiment Classification  DSC full                          Accuracy 87.34    40
Aspect Sentiment Classification    ASC                               Accuracy 83.68    40
Document Sentiment Classification  DSC small                         Accuracy 67.41    40
Forgetting Rate                    20News FR                         Accuracy 24.37    34
Image Classification               M(EMNIST-10, F-EMNIST) Overall    Accuracy 77.1     22
Image Classification               M(CIFAR100-10, F-CelebA) Overall  Accuracy 69.09    22
Image Classification               M(CIFAR100-20, F-CelebA) Overall  Accuracy 71.64    22
Image Classification               M(EMNIST-20, F-EMNIST)            Accuracy 95.66    22
Image Classification               M(CIFAR100-10, F-CelebA)          Accuracy 68.31    15

(Showing 10 of 13 rows)
