Dataset Condensation via Efficient Synthetic-Data Parameterization

About

The great success of machine learning with massive amounts of data comes at the price of huge computation and storage costs for training and tuning. Recent studies on dataset condensation attempt to reduce the dependence on such massive data by synthesizing a compact training dataset. However, the existing approaches have fundamental limitations in optimization due to the limited representability of synthetic datasets that do not take any data regularity characteristics into account. To this end, we propose a novel condensation framework that generates multiple synthetic data within a limited storage budget via an efficient parameterization that accounts for data regularity. We further analyze the shortcomings of existing gradient-matching-based condensation methods and develop an effective optimization technique for improving the condensation of training-data information. We propose a unified algorithm that drastically improves the quality of condensed data over the current state-of-the-art on CIFAR-10, ImageNet, and Speech Commands.
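The two ideas in the abstract, storing synthetic data in a compressed parameterization that decodes into more usable examples, and optimizing that storage by matching training gradients on real data, can be sketched in a toy setting. The sketch below is illustrative only, not the paper's implementation: it assumes a linear model with mean-squared error (so the gradient of the matching loss has a closed form), a "decoder" that simply repeats each stored value twice as a stand-in for multi-formation, and arbitrary sizes and labels chosen for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "real" dataset: linear regression with noisy targets.
n, d = 256, 8
X_real = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y_real = X_real @ w_true + 0.1 * rng.normal(size=n)

# Multi-formation analogue: each synthetic example is *stored* with
# d // 2 floats and decoded to full dimension by repeating every
# stored value twice, so a fixed storage budget yields more usable
# training signal than storing raw full-dimension examples.
m = 16                                   # number of synthetic examples
Z = rng.normal(size=(m, d // 2))         # stored (condensed) parameters
y_syn = X_real[:m] @ w_true              # fixed synthetic labels (illustrative)

def decode(Z):
    """Decode stored parameters to full-dimension synthetic inputs."""
    return np.repeat(Z, 2, axis=1)       # shape (m, d)

def model_grad(X, y, w):
    """Gradient of the mean-squared error of a linear model w.r.t. w."""
    r = X @ w - y                        # residuals
    return (2.0 / len(y)) * (X.T @ r), r

# Gradient matching: update Z so the model gradient on the decoded
# synthetic set matches the gradient on the real set at a fixed w.
w = rng.normal(size=d)
g_real, _ = model_grad(X_real, y_real, w)

def matching_loss(Z):
    g_syn, _ = model_grad(decode(Z), y_syn, w)
    return float(np.sum((g_syn - g_real) ** 2))

loss0 = matching_loss(Z)
loss = loss0
for _ in range(200):
    Xs = decode(Z)
    g_syn, r = model_grad(Xs, y_syn, w)
    e = g_syn - g_real
    # Closed-form gradient of ||g_syn - g_real||^2 w.r.t. Xs in the
    # linear/MSE case: (4/m) * (r e^T + (Xs e) w^T).
    gX = (4.0 / m) * (np.outer(r, e) + np.outer(Xs @ e, w))
    # Backpropagate through the decoder: each stored value feeds
    # two adjacent columns of Xs.
    gZ = gX.reshape(m, d // 2, 2).sum(axis=2)
    # Backtracking step size keeps the loss monotonically decreasing.
    step = 0.05
    while matching_loss(Z - step * gZ) >= loss and step > 1e-8:
        step *= 0.5
    Z = Z - step * gZ
    loss = matching_loss(Z)

print(f"gradient-matching loss: {loss0:.4f} -> {loss:.4f}")
```

In the paper's actual setting the model is a neural network, the matching is performed across many network checkpoints during training, and the decoder is a multi-formation function over images; here a single fixed linear model stands in for all of that.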

Jang-Hyun Kim, Jinuk Kim, Seong Joon Oh, Sangdoo Yun, Hwanjun Song, Joonhyun Jeong, Jung-Woo Ha, Hyun Oh Song• 2022

Related benchmarks

Task                 | Dataset                           | Result                | Rank
---------------------|-----------------------------------|-----------------------|-----
Image Classification | CIFAR10 (test)                    | Accuracy: 74.5        | 585
Image Classification | ImageNet-100 (val)                | Top-1 Accuracy: 30.2  | 205
Dataset Distillation | ImageWoof 1.0 (test)              | Top-1 Accuracy: 58.3  | 171
Image Classification | CIFAR-10 (test)                   | Test Accuracy: 74.5   | 154
Dataset Distillation | CIFAR-100 (test)                  | Accuracy: 45.1        | 132
Image Classification | ImageNet-100 (test)               | Clean Accuracy: 53.7  | 119
Dataset Distillation | CIFAR-10 (test)                   | Accuracy: 74.5        | 79
Image Classification | Places365                         | --                    | 67
Dataset Distillation | CIFAR-100, 10 images/class (test) | Accuracy: 45.1        | 30
Dataset Distillation | ImageWoof (val)                   | Accuracy: 48.3        | 27

Showing 10 of 33 rows.
