
Improved Techniques for Training Score-Based Generative Models

About

Score-based generative models can produce high quality image samples comparable to GANs, without requiring adversarial optimization. However, existing training procedures are limited to images of low resolution (typically below 32x32), and can be unstable under some settings. We provide a new theoretical analysis of learning and sampling from score models in high dimensional spaces, explaining existing failure modes and motivating new solutions that generalize across datasets. To enhance stability, we also propose to maintain an exponential moving average of model weights. With these improvements, we can effortlessly scale score-based generative models to images with unprecedented resolutions ranging from 64x64 to 256x256. Our score-based models can generate high-fidelity samples that rival best-in-class GANs on various image datasets, including CelebA, FFHQ, and multiple LSUN categories.
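The stability improvement described above, an exponential moving average (EMA) of model weights, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the decay rate `ema_decay` and the plain-dict parameter layout are assumptions chosen for clarity.

```python
# Minimal sketch of an exponential moving average (EMA) over model
# weights, as proposed in the abstract to stabilize training.
# Assumption: parameters are stored in a plain dict of floats; real
# implementations apply the same update to each tensor. The decay
# rate 0.999 is illustrative, not the paper's exact setting.

def ema_update(ema_params, model_params, ema_decay=0.999):
    """Blend the current weights into the running average in place."""
    for name, value in model_params.items():
        ema_params[name] = ema_decay * ema_params[name] + (1.0 - ema_decay) * value
    return ema_params

# Usage: after each optimizer step, fold the freshly updated weights
# into the EMA copy; at sampling time, generate with the EMA weights
# rather than the raw (noisier) training weights.
ema = {"w": 1.0}
for step_weights in ({"w": 0.0}, {"w": 0.0}):
    ema = ema_update(ema, step_weights)
```

Because the EMA copy changes slowly, it smooths out oscillations in the raw weights, which is what makes sampling from it more stable.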

Yang Song, Stefano Ermon • 2020

Related benchmarks

Task                           | Dataset                           | Result    | Rank
-------------------------------|-----------------------------------|-----------|-----
Image Generation               | CIFAR-10 (test)                   | FID 10.87 | 471
Unconditional Image Generation | CIFAR-10 (test)                   | FID 10.87 | 216
Image Generation               | CelebA 64x64 (test)               | FID 10.23 | 203
Unconditional Image Generation | CIFAR-10 unconditional            | FID 31.75 | 159
Image Generation               | CIFAR-10                          | FID 10.87 | 25
Unconditional image synthesis  | CIFAR-10 32x32 (test)             | FID 10.9  | 12
Image Generation               | CelebA 64x64 Unconditional (test) | FID 10.2  | 11
