
One-Step Diffusion Samplers via Self-Distillation and Deterministic Flow

About

Sampling from unnormalized target distributions is a fundamental yet challenging task in machine learning and statistics. Existing sampling algorithms typically require many iterative steps to produce high-quality samples, leading to high computational costs. We introduce one-step diffusion samplers, which learn a step-conditioned ODE so that one large step reproduces the trajectory of many small ones via a state-space consistency loss. We further show that standard ELBO estimates in diffusion samplers degrade in the few-step regime because common discrete integrators yield mismatched forward/backward transition kernels. Motivated by this analysis, we derive a deterministic-flow (DF) importance weight for ELBO estimation without a backward kernel. To calibrate DF, we introduce a volume-consistency regularization that aligns the accumulated volume change along the flow across step resolutions. Our proposed sampler therefore achieves both fast sampling and stable evidence estimates in only one or a few steps. Across challenging synthetic and Bayesian benchmarks, it achieves competitive sample quality with orders-of-magnitude fewer network evaluations while maintaining robust ELBO estimates.
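The two training signals described above can be illustrated with a toy sketch. This is not the authors' implementation: the velocity field below is a hypothetical linear drift standing in for the learned step-conditioned network, and the function names (`velocity`, `integrate`, `losses`) are illustrative. The sketch shows (a) a state-space consistency loss comparing the endpoint of one large Euler step against many small ones, and (b) a volume-consistency term comparing the accumulated log volume change of the flow map across the two step resolutions.

```python
import numpy as np

def velocity(x, t, dt):
    # Hypothetical step-conditioned velocity field v(x, t; dt);
    # a toy linear drift stands in for the learned network.
    return -x * (1.0 + 0.1 * dt)

def dvelocity_dx(x, t, dt):
    # Analytic Jacobian diagonal of the toy field (per dimension).
    return -(1.0 + 0.1 * dt) * np.ones_like(x)

def integrate(x, t0, t1, n_steps):
    # Euler integration of the step-conditioned ODE, accumulating the
    # log volume change log|det J| of the discrete flow map.
    dt = (t1 - t0) / n_steps
    t, logdet = t0, 0.0
    for _ in range(n_steps):
        jac_diag = 1.0 + dt * dvelocity_dx(x, t, dt)
        logdet += float(np.sum(np.log(np.abs(jac_diag))))
        x = x + dt * velocity(x, t, dt)
        t += dt
    return x, logdet

def losses(x0, t0, t1, n_fine=8):
    # State-space consistency: one large step should land where
    # n_fine small steps land. Volume consistency: the accumulated
    # volume change should agree across step resolutions, which is
    # what calibrates the deterministic-flow importance weight.
    x_coarse, ld_coarse = integrate(x0, t0, t1, n_steps=1)
    x_fine, ld_fine = integrate(x0, t0, t1, n_steps=n_fine)
    state_loss = float(np.mean((x_coarse - x_fine) ** 2))
    volume_loss = (ld_coarse - ld_fine) ** 2
    return state_loss, volume_loss
```

In training, both losses would be minimized over the parameters of the velocity network so that the coarse step matches its own fine-grained rollout (self-distillation); the toy drift here is fixed, so the losses are simply nonzero.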

Pascal Jutras-Dube, Jiaru Zhang, Ziran Wang, Ruqi Zhang • 2025

Related benchmarks

Task                 Dataset          Result          Rank
Bayesian Inference   Credit 25D       ELBO  -501.7    6
Bayesian Inference   Seeds 26D        ELBO  -44.35    6
Bayesian Inference   Ionosphere 35D   ELBO  -86.68    6
Bayesian Inference   Sonar 61D        ELBO  -50.58    6
Bayesian Inference   Cancer 31D       ELBO  8.66      6
Bayesian Inference   Brownian 32D     ELBO  -11.37    5
