
Distance-Based Regularisation of Deep Networks for Fine-Tuning

About

We investigate approaches to regularisation during fine-tuning of deep neural networks. First, we provide a neural network generalisation bound based on Rademacher complexity that uses the distance the weights have moved from their initial values. This bound has no direct dependence on the number of weights and compares favourably to other bounds when applied to convolutional networks. Our bound is highly relevant for fine-tuning, because providing a network with a good initialisation based on transfer learning means that learning can modify the weights less, and hence achieve tighter generalisation. Inspired by this, we develop a simple yet effective fine-tuning algorithm that constrains the hypothesis class to a small sphere centred on the initial pre-trained weights, thus obtaining provably better generalisation performance than conventional transfer learning. Empirical evaluation shows that our algorithm works well, corroborating our theoretical results. It outperforms both state-of-the-art fine-tuning competitors and penalty-based alternatives, which we show do not directly constrain the radius of the search space.

Henry Gouk, Timothy M. Hospedales, Massimiliano Pontil • 2020
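The core idea of the algorithm described above, constraining fine-tuned weights to a small sphere around the pre-trained initialisation, can be sketched as projected gradient descent: after each update, project the weights back onto an L2 ball centred on the initial weights. The following is a minimal NumPy sketch, not the authors' implementation; the function names, the flat weight vector, and the `radius` hyperparameter are illustrative assumptions.

```python
import numpy as np

def project_to_ball(w, w0, radius):
    """Project weights w onto the L2 ball of the given radius centred on
    the pre-trained weights w0. If w is already inside the ball, return
    it unchanged; otherwise rescale the displacement from w0 to have
    norm equal to radius."""
    delta = w - w0
    norm = np.linalg.norm(delta)
    if norm <= radius:
        return w
    return w0 + delta * (radius / norm)

def finetune_step(w, w0, grad, lr, radius):
    """One projected-gradient fine-tuning step: a plain SGD update
    followed by projection back into the constraint set."""
    return project_to_ball(w - lr * grad, w0, radius)
```

Because the projection is applied after every step, the hypothesis class is hard-constrained to the sphere, unlike an L2-to-initialisation penalty, which only discourages (but does not bound) the distance from the pre-trained weights.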

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Image Classification | CUB-200-2011 (test) | Top-1 Acc | 81.23 | 276 |
| Image Classification | FGVC-Aircraft (test) | – | – | 231 |
| Image Classification | Oxford Flowers-102 (test) | Top-1 Acc | 93.23 | 131 |
| Image Classification | Stanford Dogs (test) | Top-1 Acc | 86.48 | 85 |
| Image Classification | Caltech-256 (test) | Top-1 Acc | 83.25 | 59 |
| Image Classification | Cars (test) | – | – | 57 |
| Few-shot Image Classification | miniImageNet meta (test) | – | – | 46 |
| Image Classification | MIT-67 (MIT-Indoor) (test) | Top-1 Acc | 77.31 | 45 |
| Medical Image Classification | ChestX-ray14 | Mean AUROC | 0.8235 | 18 |
| Image Classification | DomainNet Source: Real 100% data (test) | Accuracy (Real) | 77.19 | 15 |

Showing 10 of 31 rows.
