
Net2Net: Accelerating Learning via Knowledge Transfer

About

We introduce techniques for rapidly transferring the information stored in one neural net into another neural net. The main purpose is to accelerate the training of a significantly larger neural net. In real-world workflows, one often trains many different neural networks during the experimentation and design process. This is a wasteful process in which each new model is trained from scratch. Our Net2Net technique accelerates experimentation by instantaneously transferring the knowledge from a previous network to each new deeper or wider network. Our techniques are based on the concept of function-preserving transformations between neural network specifications. This differs from previous approaches to pre-training, which altered the function represented by a neural net when adding layers to it. Using our knowledge transfer mechanism to add depth to Inception modules, we demonstrate a new state-of-the-art accuracy on the ImageNet dataset.
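The core idea of a function-preserving transformation can be illustrated with the widening operation (Net2WiderNet from the paper): extra hidden units are copies of randomly chosen existing units, and the outgoing weights of each replicated unit are divided by its replication count, so the widened network computes exactly the same function. Below is a minimal NumPy sketch under our own assumptions; the function name `net2wider` and the toy two-layer MLP are illustrative, not the paper's code.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy two-layer MLP: x -> relu(x @ W1 + b1) -> h @ W2 + b2
n_in, n_hidden, n_out = 4, 3, 2
W1 = rng.normal(size=(n_in, n_hidden)); b1 = rng.normal(size=n_hidden)
W2 = rng.normal(size=(n_hidden, n_out)); b2 = rng.normal(size=n_out)

def forward(x, W1, b1, W2, b2):
    h = np.maximum(x @ W1 + b1, 0.0)
    return h @ W2 + b2

def net2wider(W1, b1, W2, new_width, rng):
    """Widen the hidden layer to `new_width` units while preserving the
    network's function. New units replicate randomly chosen old units;
    outgoing weights are divided by each unit's replication count."""
    old_width = W1.shape[1]
    # Mapping g: identity on the first old_width units, random copies after.
    g = np.concatenate([np.arange(old_width),
                        rng.integers(0, old_width, new_width - old_width)])
    counts = np.bincount(g, minlength=old_width)  # replication factor per unit
    W1_new = W1[:, g]                  # copy incoming weights of unit g(j)
    b1_new = b1[g]
    W2_new = W2[g, :] / counts[g][:, None]  # split outgoing weights evenly
    return W1_new, b1_new, W2_new

W1w, b1w, W2w = net2wider(W1, b1, W2, new_width=5, rng=rng)
x = rng.normal(size=(8, n_in))
# The widened network matches the original on every input.
assert np.allclose(forward(x, W1, b1, W2, b2), forward(x, W1w, b1w, W2w, b2))
```

Because the copied units receive identical pre-activations, splitting their outgoing weights leaves every output unchanged; training then proceeds from this wider but equivalent starting point.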

Tianqi Chen, Ian Goodfellow, Jonathon Shlens · 2015

Related benchmarks

| Task | Dataset | Result | Rank |
|---|---|---|---|
| Image Classification | CIFAR-100 (test) | Accuracy 76.48 | 3518 |
| Image Classification | CIFAR-10 (test) | Accuracy 91.78 | 3381 |
| Image Classification | ImageNet | -- | 184 |
| Regression | California Housing | MSE 0.391 | 71 |
| Image Classification | MNIST (train) | Train Accuracy 98.99 | 53 |
| 2D heat conductivity inversion | 2D heat equation S=1500 | Best Relative Error 36.2 | 9 |
| 2D heat conductivity inversion | 2D heat equation S=1000 | Best Relative Error 0.4 | 9 |
| Wind velocity reconstruction | Wind velocity reconstruction S=1000 | Best Error 13.28 | 7 |
| Wind velocity reconstruction | Wind velocity reconstruction S=5000 | Best Error 4.6 | 7 |
| Image Classification | MNIST S=600 (train) | Best Accuracy 87.64 | 7 |

Showing 10 of 12 rows.
