
Guess & Guide: Gradient-Free Zero-Shot Diffusion Guidance

About

Pretrained diffusion models serve as effective priors for Bayesian inverse problems: they enable zero-shot generation by sampling from the conditional distribution, avoiding task-specific retraining. However, a major limitation of existing methods is their reliance on surrogate likelihoods that require vector-Jacobian products through the denoiser at each denoising step, creating a substantial computational burden. To address this, we introduce a lightweight likelihood surrogate that eliminates gradient computation through the denoiser network, allowing us to handle diverse inverse problems without backpropagation overhead. Experiments confirm that our method reduces inference cost dramatically while achieving the best results on multiple tasks. Broadly speaking, we propose the fastest Pareto-optimal method for Bayesian inverse problems.
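To make the contrast concrete, here is a minimal sketch of guidance for a linear inverse problem y = A x + noise, where the likelihood gradient is evaluated at the denoiser's clean-image estimate rather than backpropagated through the network. All names (`denoiser`, `A`, `sigma_y`, `step`) are placeholders for illustration, not the paper's actual method or API.

```python
import numpy as np

def denoiser(x_t, t):
    # Stand-in for a pretrained diffusion denoiser that predicts the
    # clean image x0 from the noisy iterate x_t at noise level t.
    return x_t / (1.0 + t)

def gradient_free_guided_step(x_t, t, A, y, sigma_y, step):
    # Tweedie-style estimate of the clean image from the current iterate.
    x0_hat = denoiser(x_t, t)
    # Surrogate likelihood gradient evaluated at x0_hat: for a Gaussian
    # likelihood this is A^T (y - A x0_hat) / sigma_y^2. Because the
    # gradient is taken with respect to x0_hat directly, no
    # vector-Jacobian product through the denoiser network is needed.
    residual = y - A @ x0_hat
    guidance = A.T @ residual / sigma_y**2
    # Nudge the clean-image estimate toward consistency with y.
    return x0_hat + step * guidance
```

Gradient-based samplers would instead differentiate the data-fit term through `denoiser`, which requires a backward pass per denoising step; the sketch above needs only forward evaluations.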

Abduragim Shtanchaev, Albina Ilina, Yazid Janati, Arip Asadulaev, Martin Takáč, Eric Moulines • 2026

Related benchmarks

Task                   Dataset    Metric  Result  Rank
Gaussian Deblurring    FFHQ       PSNR    29      34
Gaussian Deblurring    ImageNet   SSIM    0.77    32
Super-Resolution (4x)  ImageNet   PSNR    26.6    30
Motion Deblurring      ImageNet   SSIM    0.72    27
Phase Retrieval        FFHQ       PSNR    19.7    26
Inpaint (box)          ImageNet   PSNR    18.3    26
HDR                    FFHQ       PSNR    22.9    25
Motion Deblurring      FFHQ       PSNR    28.1    22
HDR                    ImageNet   PSNR    23.2    21
Gaussian Deblurring    ImageNet   PSNR    27.3    19

Showing 10 of 43 rows.
