
Efficient parametrization of multi-domain deep neural networks

About

A practical limitation of deep neural networks is their high degree of specialization to a single task and visual domain. Recently, inspired by the successes of transfer learning, several authors have proposed to learn instead universal, fixed feature extractors that, used as the first stage of any deep network, work well for several tasks and domains simultaneously. Nevertheless, such universal features are still somewhat inferior to specialized networks. To overcome this limitation, in this paper we propose to consider instead universal parametric families of neural networks, which still contain specialized problem-specific models, but differ only by a small number of parameters. We study different designs for such parametrizations, including series and parallel residual adapters, joint adapter compression, and parameter allocations, and empirically identify the ones that yield the highest compression. We show that, in order to maximize performance, it is necessary to adapt both shallow and deep layers of a deep network, but the required changes are very small. We also show that these universal parametrizations are very effective for transfer learning, where they outperform traditional fine-tuning techniques.
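To make the parallel residual adapter idea concrete, here is a minimal PyTorch sketch, not the authors' implementation: the class name `ParallelResidualAdapter`, the layer sizes, and the freezing strategy are all illustrative assumptions. A small domain-specific 1x1 convolution is added in parallel to a frozen, domain-agnostic convolution, so each new domain learns only the adapter's few parameters.

```python
import torch
import torch.nn as nn

class ParallelResidualAdapter(nn.Module):
    """Illustrative sketch (not the paper's code): a domain-specific 1x1
    convolution applied in parallel to a frozen shared convolution.
    A series adapter would instead compose the 1x1 convolution after
    the shared layer's output, with a skip connection around it."""

    def __init__(self, shared_conv: nn.Conv2d):
        super().__init__()
        self.shared_conv = shared_conv  # universal weights, kept frozen
        for p in self.shared_conv.parameters():
            p.requires_grad = False
        # Domain-specific adapter: a 1x1 convolution whose parameter
        # count is small relative to the shared 3x3 convolution.
        self.adapter = nn.Conv2d(
            shared_conv.in_channels,
            shared_conv.out_channels,
            kernel_size=1,
            stride=shared_conv.stride,
            bias=False,
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Parallel configuration: the adapter output is summed with the
        # shared convolution's output as a residual correction.
        return self.shared_conv(x) + self.adapter(x)

# Hypothetical usage: adapt one pre-trained 3x3 layer to a new domain.
shared = nn.Conv2d(64, 64, kernel_size=3, padding=1, bias=False)
layer = ParallelResidualAdapter(shared)
out = layer(torch.randn(1, 64, 32, 32))  # shape: (1, 64, 32, 32)
```

Note the parameter budget this buys: a 1x1 adapter over C input and C output channels adds C^2 weights, versus 9C^2 for the 3x3 layer it augments, so each extra domain costs roughly a ninth of the layer's parameters, in line with the abstract's claim that the required per-domain changes are very small.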

Sylvestre-Alvise Rebuffi, Hakan Bilen, Andrea Vedaldi • 2018

Related benchmarks

Task | Dataset | Result | Rank
Classification | Cars | Accuracy: 91.89 | 314
Image Classification | CUB | Accuracy: 83.61 | 249
Image Classification | Flowers | Accuracy: 95.73 | 127
Image Classification | Visual Decathlon Challenge 1.0 (test) | Mean Accuracy: 78.1 | 81
Image Classification | Sketch | -- | 20
Incremental Multi-Task Learning | DomainNet | Accuracy (Real): 81.51 | 4
Joint Multi-Task Learning | DomainNet | Real Accuracy: 75.01 | 3
