
Piggyback: Adapting a Single Network to Multiple Tasks by Learning to Mask Weights

About

This work presents a method for adapting a single, fixed deep neural network to multiple tasks without affecting performance on already learned tasks. By building upon ideas from network quantization and pruning, we learn binary masks that "piggyback" on an existing network, or are applied to unmodified weights of that network to provide good performance on a new task. These masks are learned in an end-to-end differentiable fashion, and incur a low overhead of 1 bit per network parameter, per task. Even though the underlying network is fixed, the ability to mask individual weights allows for the learning of a large number of filters. We show performance comparable to dedicated fine-tuned networks for a variety of classification tasks, including those with large domain shifts from the initial task (ImageNet), and a variety of network architectures. Unlike prior work, we do not suffer from catastrophic forgetting or competition between tasks, and our performance is agnostic to task ordering. Code available at https://github.com/arunmallya/piggyback.
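The abstract compresses the whole mechanism into a few sentences, so a sketch may help. Below is a minimal PyTorch-style illustration of the masking idea, not the authors' implementation (see the linked repository for that): real-valued mask scores are hard-thresholded to {0, 1} in the forward pass, gradients reach the scores through a straight-through estimator, and the pretrained weights themselves are never updated. The class names, initialization constant, and threshold value are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class Binarizer(torch.autograd.Function):
    """Hard-threshold real-valued mask scores to {0, 1} on the forward
    pass; pass the gradient straight through on the backward pass."""

    @staticmethod
    def forward(ctx, scores, threshold):
        return (scores > threshold).float()

    @staticmethod
    def backward(ctx, grad_output):
        # Straight-through estimator: the thresholding step is treated
        # as identity, so the real-valued scores receive the gradient
        # computed for the binary mask. No gradient for the threshold.
        return grad_output, None


class MaskedLinear(nn.Module):
    """A linear layer with a frozen pretrained weight; only the
    per-weight mask scores are trained for a new task."""

    def __init__(self, weight, bias=None, threshold=0.0):
        super().__init__()
        # The shared backbone weights stay fixed across all tasks.
        self.weight = nn.Parameter(weight, requires_grad=False)
        self.bias = nn.Parameter(bias, requires_grad=False) if bias is not None else None
        # One real-valued score per weight, initialized above the
        # threshold so the initial mask keeps every weight active.
        self.scores = nn.Parameter(0.01 * torch.ones_like(weight))
        self.threshold = threshold

    def forward(self, x):
        mask = Binarizer.apply(self.scores, self.threshold)  # 0/1 tensor
        return F.linear(x, self.weight * mask, self.bias)
```

Only `scores` has `requires_grad=True`, so training a new task touches nothing but the mask; at deployment, each task stores just its binarized mask alongside the shared backbone, which is the 1-bit-per-parameter overhead the abstract refers to.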

Arun Mallya, Dillon Davis, Svetlana Lazebnik • 2018

Related benchmarks

Task                 | Dataset                   | Result                | Rank
Image Classification | CIFAR-100 (test)          | Accuracy: 67.2        | 3518
Image Classification | CIFAR-100                 | Accuracy: 79.76       | 691
Image Classification | Stanford Cars             | --                    | 635
Image Classification | SVHN (test)               | Accuracy: 96.8        | 401
Classification       | Cars                      | Accuracy: 89.62       | 395
Image Classification | Stanford Cars (test)      | Accuracy: 89.62       | 316
Image Classification | CUB-200-2011 (test)       | Top-1 Acc: 81.59      | 286
Image Classification | CUB                       | Accuracy: 81.59       | 282
Image Classification | Oxford Flowers-102 (test) | Top-1 Accuracy: 94.77 | 192
Image Classification | Flowers                   | Accuracy: 94.76       | 127
Showing 10 of 34 rows

Other info

Code

https://github.com/arunmallya/piggyback