
PackNet: Adding Multiple Tasks to a Single Network by Iterative Pruning

About

This paper presents a method for adding multiple tasks to a single deep neural network while avoiding catastrophic forgetting. Inspired by network pruning techniques, we exploit redundancies in large deep networks to free up parameters that can then be employed to learn new tasks. By performing iterative pruning and network re-training, we are able to sequentially "pack" multiple tasks into a single network while ensuring minimal drop in performance and minimal storage overhead. Unlike prior work that uses proxy losses to maintain accuracy on older tasks, we always optimize for the task at hand. We perform extensive experiments on a variety of network architectures and large-scale datasets, and observe much better robustness against catastrophic forgetting than prior work. In particular, we are able to add three fine-grained classification tasks to a single ImageNet-trained VGG-16 network and achieve accuracies close to those of separately trained networks for each task. Code available at https://github.com/arunmallya/packnet

Arun Mallya, Svetlana Lazebnik • 2017
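The core idea in the abstract — iteratively pruning low-magnitude weights to free capacity, reserving the surviving weights for the current task, and handing the freed weights to the next task — can be sketched in a few lines. The functions and array layout below are illustrative assumptions, not the authors' implementation; `owner[i] == t` stands in for the per-parameter task masks PackNet maintains:

```python
import numpy as np

def packnet_prune(weights, owner, task_id, prune_frac=0.5):
    """One PackNet-style pruning step (sketch): among weights not yet
    reserved by an earlier task (owner == 0), keep the largest-magnitude
    (1 - prune_frac) fraction for the current task and zero out the rest,
    freeing them to be retrained by future tasks."""
    free = owner == 0                       # weights still available to this task
    mags = np.abs(weights[free])
    k = int(len(mags) * (1 - prune_frac))   # number of free weights to keep
    if k > 0:
        thresh = np.sort(mags)[::-1][k - 1]           # magnitude cutoff
        keep = free & (np.abs(weights) >= thresh)
    else:
        keep = np.zeros_like(free)
    owner = owner.copy()
    owner[keep] = task_id                   # reserve kept weights for this task
    weights = weights.copy()
    weights[free & ~keep] = 0.0             # freed weights: retrained on the next task
    return weights, owner

def task_mask(owner, task_id):
    """At inference for task t, only weights owned by tasks 1..t are active,
    so earlier tasks' predictions never change (no catastrophic forgetting)."""
    return (owner >= 1) & (owner <= task_id)
```

After each pruning step the network is briefly retrained on the current task (with the reserved weights of earlier tasks frozen), which is why the paper reports only a minimal accuracy drop; the only per-task storage overhead is the integer ownership mask.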

Related benchmarks

Task                         Dataset                     Result                  Rank
Image Classification         Stanford Cars               --                      477
Classification               Cars                        Accuracy: 86.1          314
Image Classification         Stanford Cars (test)        Accuracy: 86.11         306
Image Classification         CUB-200-2011 (test)         Top-1 Acc: 80.31        276
Image Classification         CUB                         Accuracy: 80.4          249
Image Classification         Oxford Flowers-102 (test)   Top-1 Accuracy: 93.04   131
Image Classification         Flowers                     Accuracy: 93            127
Image Classification         ImageNet (val)              Accuracy: 75.71         115
Medical Image Segmentation   LA                          Dice: 88.87             97
Image Classification         Places365                   --                      62

(Showing 10 of 40 rows.)

Other info

Code

https://github.com/arunmallya/packnet