One-Step Diffusion Samplers via Self-Distillation and Deterministic Flow
About
Sampling from unnormalized target distributions is a fundamental yet challenging task in machine learning and statistics. Existing sampling algorithms typically require many iterative steps to produce high-quality samples, leading to high computational costs. We introduce one-step diffusion samplers, which learn a step-conditioned ODE so that one large step reproduces the trajectory of many small ones via a state-space consistency loss. We further show that standard ELBO estimates in diffusion samplers degrade in the few-step regime because common discrete integrators yield mismatched forward/backward transition kernels. Motivated by this analysis, we derive a deterministic-flow (DF) importance weight for ELBO estimation that requires no backward kernel. To calibrate DF, we introduce a volume-consistency regularization that aligns the accumulated volume change along the flow across step resolutions. Our proposed sampler therefore achieves both fast sampling and stable evidence estimates in only one or a few steps. Across challenging synthetic and Bayesian benchmarks, it achieves competitive sample quality with orders-of-magnitude fewer network evaluations while maintaining robust ELBO estimates.
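The state-space consistency idea can be illustrated with a minimal sketch: train a step-conditioned velocity field `v(x, t, dt)` so that one large Euler step matches the composition of two half-sized steps. The function names and the use of explicit Euler integration here are illustrative assumptions, not the paper's exact implementation.

```python
import numpy as np

def euler_step(v, x, t, dt):
    # One explicit Euler step of the step-conditioned ODE dx/dt = v(x, t, dt).
    # (Euler is an illustrative choice; the actual integrator may differ.)
    return x + dt * v(x, t, dt)

def consistency_loss(v, x, t, dt):
    # Self-distillation target in state space: one large step of size dt
    # should land on the same state as two consecutive half steps.
    one_big = euler_step(v, x, t, dt)
    half = euler_step(v, x, t, dt / 2)
    two_small = euler_step(v, half, t + dt / 2, dt / 2)
    return np.mean((one_big - two_small) ** 2)
```

For a velocity field that is constant in `x` and `t`, one big step and two half steps coincide exactly and the loss is zero; for a state-dependent field the two trajectories diverge, and minimizing this loss over step sizes is what lets a single large step reproduce the many-step trajectory.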
Related benchmarks
| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Bayesian Inference | Credit 25D | ELBO | -501.7 | 6 |
| Bayesian Inference | Seeds 26D | ELBO | -44.35 | 6 |
| Bayesian Inference | Ionosphere 35D | ELBO | -86.68 | 6 |
| Bayesian Inference | Sonar 61D | ELBO | -50.58 | 6 |
| Bayesian Inference | Cancer 31D | ELBO | 8.66 | 6 |
| Bayesian Inference | Brownian 32D | ELBO | -11.37 | 5 |