
Diffusion Generative Flow Samplers: Improving learning signals through partial trajectory optimization

About

We tackle the problem of sampling from intractable high-dimensional density functions, a fundamental task that often appears in machine learning and statistics. We extend recent sampling-based approaches that leverage controlled stochastic processes to model approximate samples from these target densities. The main drawback of these approaches is that the training objective requires full trajectories to compute, which leads to slow credit assignment: the learning signal is present only at the terminal time. In this work, we present Diffusion Generative Flow Samplers (DGFS), a sampling-based framework in which the learning process can be tractably broken down into short partial trajectory segments by parameterizing an additional "flow function". Our method takes inspiration from the theory developed for generative flow networks (GFlowNets), allowing us to make use of intermediate learning signals. Through various challenging experiments, we demonstrate that DGFS achieves more accurate estimates of the normalization constant than closely related prior methods.
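The core idea, splitting training into partial trajectory segments balanced by a learned flow function, can be illustrated with a minimal sketch. The function and variable names below are hypothetical (they are not from the DGFS codebase), and this is only a toy illustration of a sub-trajectory-style squared-discrepancy loss, assuming forward/backward transition log-probabilities and a log-flow function evaluated along a sampled trajectory:

```python
import torch

torch.manual_seed(0)

def partial_trajectory_loss(log_flow, log_pf, log_pb, m, n):
    """Squared discrepancy over the segment [m, n] of one trajectory.

    log_flow : (T+1,) tensor, log F(x_t, t) at each state of the trajectory
    log_pf   : (T,)   tensor, log P_F(x_{t+1} | x_t) forward transition terms
    log_pb   : (T,)   tensor, log P_B(x_t | x_{t+1}) backward transition terms

    The loss drives  log F(x_m) + sum log P_F  to match
                     log F(x_n) + sum log P_B  on the segment,
    so gradients flow from any segment, not only from the terminal reward.
    """
    lhs = log_flow[m] + log_pf[m:n].sum()
    rhs = log_flow[n] + log_pb[m:n].sum()
    return (lhs - rhs) ** 2

# Dummy quantities standing in for one sampled trajectory of length T = 8.
T = 8
log_flow = torch.randn(T + 1)
log_pf = torch.randn(T)
log_pb = torch.randn(T)

# The objective can be evaluated on any short segment, e.g. steps 2..5,
# giving an intermediate learning signal.
loss = partial_trajectory_loss(log_flow, log_pf, log_pb, 2, 5)
```

In this sketch the flow function plays the role of a "partial normalizer" at intermediate times, which is what makes the objective decomposable into segments.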

Dinghuai Zhang, Ricky T. Q. Chen, Cheng-Hao Liu, Aaron Courville, Yoshua Bengio • 2023

Related benchmarks

| Task | Dataset | Result | Rank |
| --- | --- | --- | --- |
| Unconditional modeling | Funnel (d = 10) | Δ log Z: 0.527 | 30 |
| Unconditional modeling | 25GMM (d = 2) | Δ log Z: 1.127 | 30 |
| Unconditional modeling | Manywell (d = 32) | Δ log Z: 4.23 | 29 |
| Conditional sampling | MNIST pretrained VAE decoder (test) | log Z: −111.5 | 15 |
| Unconditional modeling | Log-Gaussian Cox process (d = 1600) | Δ log Z: 465.4 | 13 |
