Paraphrasing Complex Network: Network Compression via Factor Transfer

About

Many researchers have sought model compression methods that reduce the size of a deep neural network (DNN) with minimal performance degradation, so that DNNs can be deployed in embedded systems. Among these methods, knowledge transfer trains a student network with the help of a stronger teacher network. In this paper, we propose a novel knowledge transfer method that uses convolutional operations to paraphrase the teacher's knowledge and to translate it for the student. This is done by two convolutional modules, called a paraphraser and a translator. The paraphraser is trained in an unsupervised manner to extract teacher factors, which are defined as paraphrased information of the teacher network. The translator, located at the student network, extracts student factors and helps the student translate the teacher factors by mimicking them. We observed that a student network trained with the proposed factor transfer method outperforms ones trained with conventional knowledge transfer methods.
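
The paraphraser and translator are small convolutional modules attached to the teacher's and student's feature maps. The sketch below, assuming PyTorch, shows one plausible shape for the two modules and a normalized factor-matching loss; the channel widths, layer counts, paraphrase rate k, and activation choice are illustrative assumptions rather than the paper's exact configuration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class Paraphraser(nn.Module):
    """Convolutional encoder-decoder on the teacher's last feature map.

    Trained in an unsupervised manner with a reconstruction loss; the
    encoder output is the teacher factor. Depths/widths are assumptions.
    """
    def __init__(self, in_channels: int, k: float = 0.5):
        super().__init__()
        mid = int(in_channels * k)  # paraphrase rate k shrinks the channels
        self.encoder = nn.Sequential(
            nn.Conv2d(in_channels, mid, 3, padding=1), nn.LeakyReLU(0.1),
            nn.Conv2d(mid, mid, 3, padding=1), nn.LeakyReLU(0.1),
        )
        self.decoder = nn.Sequential(
            nn.Conv2d(mid, in_channels, 3, padding=1), nn.LeakyReLU(0.1),
        )

    def forward(self, x):
        factor = self.encoder(x)          # teacher factor F_T
        recon = self.decoder(factor)      # used only for reconstruction loss
        return factor, recon

class Translator(nn.Module):
    """Convolutional module on the student's last feature map.

    Its output (the student factor) is trained to mimic the teacher factor.
    """
    def __init__(self, in_channels: int, out_channels: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_channels, out_channels, 3, padding=1), nn.LeakyReLU(0.1),
            nn.Conv2d(out_channels, out_channels, 3, padding=1), nn.LeakyReLU(0.1),
        )

    def forward(self, x):
        return self.net(x)  # student factor F_S

def factor_transfer_loss(f_teacher, f_student, p: int = 1):
    """l_p distance between L2-normalized, flattened factors."""
    ft = F.normalize(f_teacher.flatten(1), dim=1)
    fs = F.normalize(f_student.flatten(1), dim=1)
    return (ft - fs).abs().pow(p).sum(dim=1).mean()
```

In this reading, training proceeds in two stages: the paraphraser is first fit alone with a reconstruction loss between the teacher feature map and the decoder output, and the student is then trained on the classification loss plus a weighted factor transfer term, e.g. `loss = ce_loss + beta * factor_transfer_loss(f_t, f_s)`.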

Jangho Kim, SeongUk Park, Nojun Kwak • 2018

Related benchmarks

Task                  | Dataset              | Metric         | Result | Rank
Image Classification  | CIFAR-100 (test)     | Accuracy       | 73.25  | 3518
Image Classification  | ImageNet-1k (val)    | Top-1 Accuracy | 69.88  | 1453
Image Classification  | ImageNet (val)       | Top-1 Accuracy | 69.88  | 1206
Image Classification  | CIFAR-100            | --             | --     | 622
Image Classification  | CIFAR-100 (test)     | Top-1 Accuracy | 75.15  | 377
Image Classification  | ILSVRC 2012 (val)    | Top-1 Accuracy | 71.56  | 156
Image Classification  | CIFAR-100 (test)     | Test Accuracy  | 75.18  | 147
Linear Classification | STL10 (test)         | Accuracy       | 0.7356 | 8
Linear Classification | TinyImageNet (test)  | Accuracy       | 33.69  | 8
