
Structural Pruning for Diffusion Models

About

Generative modeling has recently undergone remarkable advancements, primarily propelled by the transformative capabilities of Diffusion Probabilistic Models (DPMs). The impressive performance of these models, however, often entails significant computational overhead during both training and inference. To tackle this challenge, we present Diff-Pruning, an efficient compression method tailored for learning lightweight diffusion models from pre-existing ones, without the need for extensive re-training. The essence of Diff-Pruning is a Taylor expansion over pruned timesteps: a process that disregards non-contributory diffusion steps and ensembles informative gradients to identify important weights. Our empirical assessment, undertaken across several datasets, highlights two primary benefits of the proposed method: 1) Efficiency: it enables approximately a 50% reduction in FLOPs at a mere 10% to 20% of the original training expenditure; 2) Consistency: the pruned diffusion models inherently preserve generative behavior congruent with their pre-trained counterparts. Code is available at https://github.com/VainF/Diff-Pruning.
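The core idea — accumulate first-order Taylor importance scores only over the timesteps that matter, then prune low-scoring weights — can be sketched as follows. This is an illustrative simplification, not the paper's implementation: the toy denoiser, the noise schedule, the choice of "informative" steps, and the per-weight (rather than structural/channel-level) mask are all assumptions made here for brevity; see the linked repository for the actual method.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Toy stand-in for a diffusion denoiser (the paper prunes U-Nets).
model = nn.Sequential(nn.Linear(8, 32), nn.ReLU(), nn.Linear(32, 8))

T = 100                     # total diffusion steps
informative = range(0, 20)  # assumption: only a subset of steps contributes

# Accumulate first-order Taylor importance |theta * dL/dtheta| over the
# retained timesteps only, skipping the non-contributory ones.
scores = {n: torch.zeros_like(p) for n, p in model.named_parameters()}
x0 = torch.randn(16, 8)     # a batch of clean samples
for t in informative:
    alpha = 1.0 - t / T                                  # toy noise schedule
    noise = torch.randn_like(x0)
    xt = alpha ** 0.5 * x0 + (1 - alpha) ** 0.5 * noise  # noised input
    loss = nn.functional.mse_loss(model(xt), noise)      # noise-prediction loss
    model.zero_grad()
    loss.backward()
    for n, p in model.named_parameters():
        scores[n] += (p * p.grad).abs()

# Keep the top-50% weights per tensor, masking the rest.
masks = {}
for n, s in scores.items():
    k = max(1, int(0.5 * s.numel()))
    thresh = s.flatten().topk(k).values[-1]
    masks[n] = (s >= thresh).float()
```

In practice Diff-Pruning removes whole channels (structural pruning) so the resulting model is genuinely smaller and faster, whereas the unstructured mask above only zeroes individual weights; the scoring logic is the transferable part.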

Gongfan Fang, Xinyin Ma, Xinchao Wang · 2023

Related benchmarks

Task                                Dataset                    Metric                Result  Rank
Class-conditional Image Generation  ImageNet 256x256           Inception Score (IS)  201.8   815
Class-conditional Image Generation  ImageNet 256x256 (train)   IS                    214.4   345
Image Generation                    ImageNet (val)             Inception Score       156.4   247
Class-conditional Image Generation  ImageNet 256x256 (test)    FID                   9.27    208
Image Generation                    CIFAR-10 32x32 (test)      FID                   1.97    183
Image Generation                    CIFAR-10 32x32             FID                   3.73    147
Class-conditional Image Generation  ImageNet 64x64 (test)      FID                   2.57    91
Image Generation                    FFHQ 64x64 (test)          FID                   2.39    82
Image Generation                    CelebA-64                  FID                   2.87    75
Unconditional Image Generation      LSUN Bedroom 256x256       FID                   18.6    68

Showing 10 of 18 rows.
