
Relational Knowledge Distillation

About

Knowledge distillation aims at transferring knowledge acquired in one model (a teacher) to another model (a student) that is typically smaller. Previous approaches can be expressed as a form of training the student to mimic the output activations of individual data examples represented by the teacher. We introduce a novel approach, dubbed relational knowledge distillation (RKD), that instead transfers mutual relations among data examples. As concrete realizations of RKD, we propose distance-wise and angle-wise distillation losses that penalize structural differences in these relations. Experiments conducted on different tasks show that the proposed method improves educated student models by a significant margin. In particular for metric learning, it allows students to outperform their teachers' performance, achieving state-of-the-art results on standard benchmark datasets.

Wonpyo Park, Dongju Kim, Yan Lu, Minsu Cho • 2019
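
The sketch below illustrates the two relational losses described in the abstract, written in PyTorch under the assumption that the student and teacher each produce a batch of embeddings of shape (N, D). Following the paper's description, pairwise distances are normalized by their batch mean and both terms use a Huber (smooth L1) penalty; the function names and any loss weights here are illustrative and not taken from the authors' released code.

```python
import torch
import torch.nn.functional as F


def pairwise_distances(embeddings: torch.Tensor) -> torch.Tensor:
    """Euclidean distances between all pairs in a batch: (N, D) -> (N, N)."""
    return torch.cdist(embeddings, embeddings, p=2)


def rkd_distance_loss(student: torch.Tensor, teacher: torch.Tensor) -> torch.Tensor:
    """Distance-wise loss: match the normalized pairwise-distance structure."""
    with torch.no_grad():
        t_d = pairwise_distances(teacher)
        t_d = t_d / t_d[t_d > 0].mean()    # normalize by mean non-zero distance
    s_d = pairwise_distances(student)
    s_d = s_d / s_d[s_d > 0].mean()
    return F.smooth_l1_loss(s_d, t_d)      # Huber penalty on relational differences


def rkd_angle_loss(student: torch.Tensor, teacher: torch.Tensor) -> torch.Tensor:
    """Angle-wise loss: match the angles formed by every triplet of examples."""

    def triplet_angles(e: torch.Tensor) -> torch.Tensor:
        # Unit difference vectors e_i - e_j for all pairs, then cosines of the
        # angles at every "center" example j: output[j, i, k] = <e_ij, e_kj>.
        diff = F.normalize(e.unsqueeze(0) - e.unsqueeze(1), p=2, dim=2)  # (N, N, D)
        return torch.bmm(diff, diff.transpose(1, 2))                     # (N, N, N)

    with torch.no_grad():
        t_a = triplet_angles(teacher)
    s_a = triplet_angles(student)
    return F.smooth_l1_loss(s_a, t_a)
```

In training, these two terms would be added to the student's original task loss with scalar weights; the specific weighting is task-dependent and should be treated as a hyperparameter rather than a value prescribed by this sketch.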

Related benchmarks

Task                   Dataset                 Metric           Result   Rank
Image Classification   CIFAR-100 (test)        Accuracy         74.62    3518
Image Classification   ImageNet-1K 1.0 (val)   Top-1 Accuracy   71.23    1952
Image Classification   ImageNet-1k (val)       Top-1 Accuracy   70.4     1469
Image Classification   ImageNet-1K             Top-1 Accuracy   69.86    1239
Image Classification   ImageNet (val)          Top-1 Accuracy   70.4     1206
Image Classification   CIFAR-100 (val)         Accuracy         77.62    776
Image Classification   CIFAR-100               --               --       691
Image Classification   DTD                     Accuracy         68.08    542
Image Classification   TinyImageNet (test)     Accuracy         37.21    440
Image Classification   CIFAR100 (test)         Top-1 Accuracy   75.4     407

Showing 10 of 66 rows.
