
Towards Principled Dataset Distillation: A Spectral Distribution Perspective

About

Dataset distillation (DD) aims to compress large-scale datasets into compact synthetic counterparts for efficient model training. However, existing DD methods exhibit substantial performance degradation on long-tailed datasets. We identify two fundamental challenges: heuristic design choices for the distribution discrepancy measure and uniform treatment of imbalanced classes. To address these limitations, we propose Class-Aware Spectral Distribution Matching (CSDM), which reformulates distribution alignment via the spectrum of a well-behaved kernel function. This technique maps the original samples into frequency space, yielding the Spectral Distribution Distance (SDD). To mitigate class imbalance, we exploit the unified form of SDD to perform amplitude-phase decomposition, which adaptively prioritizes realism in tail classes. On CIFAR-10-LT, with 10 images per class, CSDM achieves a 14.0% improvement over state-of-the-art DD methods, with only a 5.7% performance drop when the number of images in tail classes decreases from 500 to 25, demonstrating strong stability on long-tailed data.
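The spectral matching described above is reminiscent of comparing empirical characteristic functions at frequencies drawn from a kernel's spectral measure (Bochner's theorem): for a Gaussian kernel, that measure is itself Gaussian. The sketch below is an illustrative assumption, not the paper's actual method: it evaluates the characteristic functions of real and synthetic samples at random frequencies and returns separate amplitude and phase discrepancies, loosely mirroring the amplitude-phase decomposition idea. All function names and parameters here are hypothetical.

```python
import numpy as np

def empirical_cf(x, freqs):
    # Empirical characteristic function E[exp(i w^T x)] evaluated at
    # each sampled frequency w; x is (n, d), freqs is (m, d).
    return np.exp(1j * x @ freqs.T).mean(axis=0)

def spectral_distribution_distance(real, syn, n_freqs=256, sigma=1.0, seed=0):
    """Illustrative sketch (assumed, not the paper's SDD): compare the
    empirical characteristic functions of real vs. synthetic samples at
    frequencies drawn from the spectral measure of a Gaussian kernel."""
    rng = np.random.default_rng(seed)
    d = real.shape[1]
    # Gaussian kernel with bandwidth sigma has a Gaussian spectral measure.
    freqs = rng.normal(scale=1.0 / sigma, size=(n_freqs, d))
    cf_r = empirical_cf(real, freqs)
    cf_s = empirical_cf(syn, freqs)
    # Amplitude-phase decomposition of the discrepancy: the two components
    # could then be weighted differently per class (weighting omitted here).
    amp_diff = np.abs(cf_r) - np.abs(cf_s)
    phase_diff = np.angle(cf_r) - np.angle(cf_s)
    # Wrap phase differences into [-pi, pi] before averaging.
    phase_diff = (phase_diff + np.pi) % (2 * np.pi) - np.pi
    return np.mean(amp_diff**2), np.mean(phase_diff**2)
```

Note that shifting a distribution changes only the phase of its characteristic function, not its amplitude, which is one intuition for treating the two components separately.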

Ruixi Wu, Shaobo Wang, Jiahuan Chen, Zhiyuan Liu, Yicun Yang, Zhaorun Chen, Zekai Li, Kaixin Li, Xinming Wang, Hongzhu Yi, Kai Wang, Linfeng Zhang · 2026

Related benchmarks

Task                  Dataset                                  Metric     Result   Rank
Image Classification  CIFAR-100                                Accuracy   54.1     691
Image Classification  CIFAR-10                                 Accuracy   78.1     246
Image Classification  CIFAR-10-LT                              Accuracy   76.5     146
Image Classification  CIFAR-100 LT                             Top-1 Acc  50       131
Image Classification  ImageSquawk (Long-tailed ImageNet)       Accuracy   18.4     22
Image Classification  ImageWoof (Long-tailed ImageNet Subset)  Accuracy   17.3     22
Image Classification  ImageMeow (Long-tailed ImageNet)         Accuracy   23.2     22
Image Classification  ImageNette (Long-tailed ImageNet)        Accuracy   29.5     22
