
Learning From Multiple Experts: Self-paced Knowledge Distillation for Long-tailed Classification

About

In real-world scenarios, data tends to exhibit a long-tailed distribution, which increases the difficulty of training deep networks. In this paper, we propose a novel self-paced knowledge distillation framework, termed Learning From Multiple Experts (LFME). Our method is inspired by the observation that networks trained on less imbalanced subsets of the distribution often yield better performance than their jointly-trained counterparts. We refer to these models as 'Experts', and the proposed LFME framework aggregates the knowledge from multiple 'Experts' to learn a unified student model. Specifically, the proposed framework involves two levels of adaptive learning schedules: Self-paced Expert Selection and Curriculum Instance Selection, so that the knowledge is adaptively transferred to the 'Student'. We conduct extensive experiments and demonstrate that our method achieves superior performance compared to state-of-the-art methods. We also show that our method can be easily plugged into state-of-the-art long-tailed classification algorithms for further improvements.

Liuyu Xiang, Guiguang Ding, Jungong Han • 2020
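
The abstract does not spell out the training objective, but as a rough illustration the sketch below shows one plausible form of the multi-expert distillation loss it describes: a cross-entropy term on the ground-truth labels plus temperature-softened KL terms from each expert, each scaled by a self-paced expert weight. The function name lfme_distillation_loss, the expert_weights input, and the temperature/alpha hyperparameters are illustrative assumptions, not the authors' exact formulation.

```python
# Minimal sketch of multi-expert knowledge distillation (PyTorch).
# Assumptions: experts are frozen, weights come from some self-paced schedule.
import torch
import torch.nn.functional as F

def lfme_distillation_loss(student_logits, expert_logits_list, expert_weights,
                           targets, temperature=2.0, alpha=0.5):
    """Cross-entropy on labels plus weighted soft distillation from each expert.

    student_logits:     (B, C) logits from the unified student model
    expert_logits_list: list of (B, C) logit tensors, one per expert
    expert_weights:     list of scalars in [0, 1], the self-paced expert weights
    """
    # Hard-label supervision for the student.
    ce = F.cross_entropy(student_logits, targets)

    # Soft-label supervision from each expert, weighted by its schedule weight.
    kd = student_logits.new_zeros(())
    log_p_student = F.log_softmax(student_logits / temperature, dim=1)
    for w, expert_logits in zip(expert_weights, expert_logits_list):
        p_expert = F.softmax(expert_logits.detach() / temperature, dim=1)
        # KL(expert || student), scaled by T^2 as in standard distillation.
        kd = kd + w * F.kl_div(log_p_student, p_expert,
                               reduction="batchmean") * temperature ** 2

    return ce + alpha * kd
```

In the paper's setting, the expert weights and the set of instances that receive distillation are adapted over training (Self-paced Expert Selection and Curriculum Instance Selection); the fixed weights above are only a stand-in for those schedules.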

Related benchmarks

Task | Dataset | Result | Rank
Image Classification | ImageNet LT | Top-1 Accuracy: 38.8 | 251
Long-Tailed Image Classification | ImageNet-LT (test) | Top-1 Acc (Overall): 38.8 | 220
Image Classification | ImageNet-LT (test) | -- | 159
Image Classification | Places-LT (test) | Accuracy (Medium): 39.6 | 128
Image Classification | CIFAR-100-LT IF 100 (test) | Top-1 Acc: 43.8 | 77
Image Classification | ImageNet-LT (val) | Top-1 Acc (Total): 37.2 | 72
Image Classification | CIFAR-100-LT Imbalance Ratio 100 (test) | Accuracy: 42.3 | 62
Image Classification | CIFAR-100-LT Imbalance Factor 100 (test) | Top-1 Accuracy: 43.8 | 44
Image Classification | CIFAR100 LT-100 1.0 (test) | Top-1 Acc (All): 43.8 | 35
Image Classification | ImageNet LT 2018 (test) | Top-1 Acc: 38.8 | 34

Showing 10 of 13 rows.
