
Is Synthetic Data From Diffusion Models Ready for Knowledge Distillation?

About

Diffusion models have recently achieved astonishing performance in generating high-fidelity, photo-realistic images. Despite this success, it remains unclear whether such synthetic images are suitable for knowledge distillation when real images are unavailable. In this paper, we extensively study whether and how synthetic images produced by state-of-the-art diffusion models can be used for knowledge distillation without access to real images, and reach three key conclusions: (1) synthetic data from diffusion models can readily achieve state-of-the-art performance among existing synthesis-based distillation methods, (2) low-fidelity synthetic images are better teaching materials, and (3) relatively weak classifiers are better teachers. Code is available at https://github.com/zhengli97/DM-KD.
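The paper builds on standard knowledge distillation, where a student network is trained to match a teacher's temperature-softened output distribution. As a minimal sketch of that objective (the Hinton-style KL loss; the temperature value and helper names here are illustrative assumptions, not details from this paper):

```python
import numpy as np

def softmax(logits, T=1.0):
    """Temperature-softened softmax over the last axis."""
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kd_loss(student_logits, teacher_logits, T=4.0):
    """KL(teacher || student) on T-softened distributions,
    scaled by T^2 as in standard distillation practice."""
    p = softmax(teacher_logits, T)  # teacher soft targets
    q = softmax(student_logits, T)  # student predictions
    kl = np.sum(p * (np.log(p + 1e-12) - np.log(q + 1e-12)), axis=-1)
    return (T * T) * kl.mean()
```

In the synthesis-based setting studied here, the inputs to both networks would be diffusion-generated images rather than real training data; the loss itself is unchanged.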

Zheng Li, Yuxuan Li, Penghai Zhao, Renjie Song, Xiang Li, Jian Yang • 2023

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
| --- | --- | --- | --- | --- |
| Image Classification | ImageNet-1K (val) | Top-1 Accuracy | 72.43 | 1866 |
| Image Classification | CIFAR-100 (val) | Accuracy | 76.58 | 661 |
| Image Classification | ImageNet-100 (test) | Accuracy | 85.9 | 3 |
| Image Classification | Flowers-102 (test) | Accuracy | 85.4 | 3 |
