
Initialization-Aware Score-Based Diffusion Sampling

About

Score-based generative models (SGMs) aim to generate samples from a target distribution by approximating the reverse-time dynamics of a stochastic differential equation. Despite their strong empirical performance, classical samplers initialized from a Gaussian distribution require a long noising time horizon, typically inducing a large number of discretization steps and a high computational cost. In this work, we present a Kullback-Leibler convergence analysis of Variance Exploding diffusion samplers that highlights the critical role of the backward-process initialization. Based on this result, we propose a theoretically grounded sampling strategy that learns the reverse-time initialization, directly minimizing the initialization error. The resulting procedure is independent of the specific score-training procedure, network architecture, and discretization scheme. Experiments on toy distributions and benchmark datasets demonstrate competitive or improved generative quality with significantly fewer sampling steps.
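The abstract describes sampling by integrating the reverse-time dynamics of a Variance Exploding (VE) SDE, with the quality of the result depending on how the backward process is initialized. As a rough illustration only (not the paper's method), the sketch below runs an Euler-Maruyama discretization of the reverse VE SDE for a 1D Gaussian target, where the perturbed score is available in closed form; the noise schedule, step count, and all parameter names are illustrative assumptions.

```python
import numpy as np

# Toy sketch: reverse-time Variance Exploding (VE) sampling with an
# analytic score. Target: 1D Gaussian N(mu, s^2), so the noised marginal
# p_t = N(mu, s^2 + sigma(t)^2) has a closed-form score.
# sigma_min, sigma_max, n_steps, etc. are illustrative choices, not the
# paper's hyperparameters.

rng = np.random.default_rng(0)
mu, s = 2.0, 0.5                   # target mean / std
sigma_min, sigma_max = 0.01, 10.0  # VE schedule sigma(t) = sigma_min*(sigma_max/sigma_min)**t
n_steps, n_samples = 200, 20000

def sigma(t):
    return sigma_min * (sigma_max / sigma_min) ** t

def score(x, t):
    # grad_x log N(x; mu, s^2 + sigma(t)^2)
    return (mu - x) / (s**2 + sigma(t)**2)

# Classical backward-process initialization: a wide Gaussian whose scale
# matches sigma(T) = sigma_max at the end of the noising horizon.
x = rng.normal(0.0, sigma_max, size=n_samples)

# Euler-Maruyama steps of the reverse SDE, integrated from t=1 down to 0:
#   x_{t-dt} = x_t + g(t)^2 * score(x_t, t) * dt + sqrt(g(t)^2 * dt) * z,
# with g(t)^2 = d sigma^2(t)/dt = 2 sigma(t)^2 log(sigma_max/sigma_min).
ts = np.linspace(1.0, 0.0, n_steps + 1)
for t_cur, t_next in zip(ts[:-1], ts[1:]):
    dt = t_cur - t_next  # positive step size
    g2 = 2.0 * sigma(t_cur)**2 * np.log(sigma_max / sigma_min)
    x = x + g2 * score(x, t_cur) * dt + np.sqrt(g2 * dt) * rng.normal(size=n_samples)

print(x.mean(), x.std())  # should be close to (mu, s)
```

In this analytic setting the score is exact, so the remaining error comes from the discretization and from the mismatch between the Gaussian initialization and the true noised marginal at t = 1; the paper's point is that learning this initialization lets the horizon, and hence the number of steps, shrink.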

Tiziano Fassina, Gabriel Cardoso, Sylvain Le Corff, Thomas Romary • 2026

Related benchmarks

Task                | Dataset                       | Result        | Rank
--------------------|-------------------------------|---------------|-----
Image Generation    | FFHQ 64x64 (test)             | FID 2.47      | 82
Generative Modeling | GMM (d=40)                    | MaxSWD 0.1    | 18
Generative Modeling | GMM (d=100)                   | MaxSWD 0.15   | 9
Generative Modeling | HT (d=100)                    | MaxSWD 0.529  | 9
Image Generation    | ImageNet dogs 512x512 (train) | FID 5.53      | 3
Image Generation    | ImageNet-birds 50-class 512   | FID 3.38      | 3
