
Denoising Diffusion Samplers

About

Denoising diffusion models are a popular class of generative models providing state-of-the-art results in many domains. One gradually adds noise to the data using a diffusion that transforms the data distribution into a Gaussian distribution. Samples from the generative model are then obtained by simulating an approximation of the time-reversal of this diffusion, initialized with Gaussian samples. In practice, the intractable score terms appearing in the time-reversed process are approximated using score matching techniques. We explore here a similar idea to sample approximately from unnormalized probability density functions and estimate their normalizing constants. We consider a process where the target density diffuses towards a Gaussian. Denoising Diffusion Samplers (DDS) are obtained by approximating the corresponding time-reversal. While score matching is not applicable in this context, we can leverage many of the ideas introduced in generative modeling for Monte Carlo sampling. Existing theoretical results for denoising diffusion models also provide theoretical guarantees for DDS. We discuss the connections between DDS, optimal control and Schrödinger bridges, and finally demonstrate DDS experimentally on a variety of challenging sampling tasks.
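To make the mechanism in the abstract concrete, here is a minimal sketch of simulating the time-reversal of a noising diffusion. This is not the authors' DDS method: it uses a one-dimensional Gaussian target whose score under the forward Ornstein–Uhlenbeck process is known in closed form, standing in for the learned drift that DDS would train. The forward SDE dx = -x dt + √2 dW is reversed via dx = [x + 2∇log p_t(x)] dτ + √2 dW (in reversed time τ = T - t) and integrated with Euler–Maruyama.

```python
import numpy as np

rng = np.random.default_rng(0)

sigma2 = 4.0          # variance of the target N(0, sigma2); an illustrative choice
T = 3.0               # diffusion horizon: by t = T the process is close to N(0, 1)
n_steps = 1500
dt = T / n_steps
n_samples = 20_000

# Initialize from the reference Gaussian N(0, 1), approximating p_T.
x = rng.standard_normal(n_samples)

# Integrate the reverse-time SDE from t = T down to t = 0.
for k in range(n_steps):
    t = T - k * dt
    # Marginal variance of the OU process started from N(0, sigma2):
    # var_t = e^{-2t} sigma2 + (1 - e^{-2t}).
    var_t = np.exp(-2.0 * t) * sigma2 + 1.0 - np.exp(-2.0 * t)
    score = -x / var_t                      # exact score of the Gaussian marginal
    drift = x + 2.0 * score                 # reverse drift: -f + g^2 * score
    x = x + dt * drift + np.sqrt(2.0 * dt) * rng.standard_normal(n_samples)

# x should now be approximately distributed as the target N(0, sigma2).
print(np.mean(x), np.var(x))
```

In DDS proper, the analytic `score` above is replaced by a neural drift trained without score matching (since only the unnormalized target density is available), but the reverse-time simulation loop has the same shape.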

Francisco Vargas, Will Grathwohl, Arnaud Doucet • 2023

Related benchmarks

Task | Dataset | Metric | Result | Rank
Unconditional modeling | Funnel (d = 10) | Δ log Z | 0.424 | 30
Unconditional modeling | 25GMM (d = 2) | Δ log Z | 1.76 | 30
Unconditional modeling | Manywell (d = 32) | Δ log Z | 7.36 | 29
Unconditional modeling | Log-Gaussian Cox process (d = 1600) | Δ log Z | 471.6 | 13
Bayesian Inference | Credit (25D) | ELBO | -514.7 | 6
Bayesian Inference | Seeds (26D) | ELBO | -75.21 | 6
Bayesian Inference | Cancer (31D) | ELBO | 20 | 6
Bayesian Inference | Ionosphere (35D) | ELBO | -114.2 | 6
Bayesian Inference | Sonar (61D) | ELBO | -121.2 | 6
Boltzmann Distribution Sampling | LJ-13 | E(·) W2 | 24.61 | 6

(Showing 10 of 12 rows)
