
Improved Denoising Diffusion Probabilistic Models

About

Denoising diffusion probabilistic models (DDPMs) are a class of generative models which have recently been shown to produce excellent samples. We show that with a few simple modifications, DDPMs can also achieve competitive log-likelihoods while maintaining high sample quality. Additionally, we find that learning variances of the reverse diffusion process allows sampling with an order of magnitude fewer forward passes with a negligible difference in sample quality, which is important for the practical deployment of these models. We additionally use precision and recall to compare how well DDPMs and GANs cover the target distribution. Finally, we show that the sample quality and likelihood of these models scale smoothly with model capacity and training compute, making them easily scalable. We release our code at https://github.com/openai/improved-diffusion.
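Two of the paper's modifications can be sketched briefly: a cosine noise schedule that replaces the linear beta schedule, and evenly strided timestep subsampling, which (together with learned reverse-process variances) is what allows sampling with far fewer forward passes. Below is a minimal NumPy sketch of those two pieces; the function names and default hyperparameters (`s=0.008`, `max_beta=0.999`, 1000 training steps) follow common conventions, not an exact transcription of the released code.

```python
import numpy as np

def cosine_alpha_bar(t, s=0.008):
    # Cosine schedule: alpha_bar(t) follows a squared-cosine curve on t in
    # [0, 1], noising the data more gradually than a linear beta schedule.
    return np.cos((t + s) / (1 + s) * np.pi / 2) ** 2

def make_betas(num_steps=1000, max_beta=0.999):
    # Derive per-step betas from the ratio of consecutive alpha_bar values,
    # clipping to max_beta to avoid singularities near t = 1.
    ts = np.arange(num_steps + 1) / num_steps
    abar = cosine_alpha_bar(ts) / cosine_alpha_bar(0.0)
    betas = 1 - abar[1:] / abar[:-1]
    return np.clip(betas, 0, max_beta)

def strided_timesteps(num_steps=1000, num_sample_steps=100):
    # Evenly spaced subset of the training timesteps used at sampling time;
    # learning the reverse-process variances keeps sample quality high even
    # at a 10x reduction in forward passes.
    return np.linspace(0, num_steps - 1, num_sample_steps).round().astype(int)
```

At sampling time one would run the reverse process only over `strided_timesteps(...)` instead of all 1000 steps, which is the "order of magnitude fewer forward passes" claim in the abstract.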

Alex Nichol, Prafulla Dhariwal · 2021

Related benchmarks

Task | Dataset | Result | Rank
Image Generation | CIFAR-10 (test) | FID 2.9 | 471
Class-conditional Image Generation | ImageNet 256x256 | -- | 441
Class-conditional Image Generation | ImageNet 256x256 (train) | -- | 305
Class-conditional Image Generation | ImageNet 256x256 (val) | -- | 293
Image Generation | ImageNet 256x256 | FID 12.26 | 243
Unconditional Image Generation | CIFAR-10 (test) | FID 2.9 | 216
Class-conditional Image Generation | ImageNet 256x256 (train val) | FID 12.26 | 178
Unconditional Image Generation | CIFAR-10 | FID 2.9 | 171
Unconditional Image Generation | CIFAR-10 unconditional | FID 2.9 | 159
Image Generation | CIFAR10 32x32 (test) | FID 2.9 | 154
Showing 10 of 82 rows.

Other info

Code

https://github.com/openai/improved-diffusion
