
FAST-DIPS: Adjoint-Free Analytic Steps and Hard-Constrained Likelihood Correction for Diffusion-Prior Inverse Problems

About

Training-free diffusion priors enable inverse-problem solvers without retraining, but for nonlinear forward operators data consistency often relies on repeated derivatives or inner optimization/MCMC loops with conservative step sizes, incurring many iterations and denoiser/score evaluations. We propose a training-free solver that replaces these inner loops with a hard measurement-space feasibility constraint (closed-form projection) and an analytic, model-optimal step size, enabling a small, fixed compute budget per noise level. Anchored at the denoiser prediction, the correction is approximated via an adjoint-free, ADMM-style splitting with projection and a few steepest-descent updates, using one VJP and either one JVP or a forward-difference probe, followed by backtracking and decoupled re-annealing. We prove local model optimality and descent under backtracking for the step-size rule, and derive an explicit KL bound for mode-substitution re-annealing under a local Gaussian conditional surrogate. We also develop a latent variant and a one-parameter pixel$\rightarrow$latent hybrid schedule. Experiments achieve competitive PSNR/SSIM/LPIPS with up to 19.5$\times$ speedup, without hand-coded adjoints or inner MCMC.
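The analytic step-size rule described above can be illustrated on the data-fidelity term $f(x)=\tfrac12\|A(x)-y\|^2$: one VJP gives the gradient $g=J^\top r$, and a single forward-difference probe estimates $Jg$, from which $\alpha=\|g\|^2/\|Jg\|^2$ is the exact minimizer along $-g$ for the locally linearized forward model. Below is a minimal NumPy sketch of this idea, not the authors' implementation: the function name `analytic_step`, the toy linear operator, and the explicit `vjp_fun` argument are illustrative assumptions (for nonlinear operators the VJP would come from autodiff), and the projection, ADMM splitting, backtracking, and re-annealing stages of the full method are omitted.

```python
import numpy as np

def analytic_step(A_fun, vjp_fun, x, y, eps=1e-6):
    # One steepest-descent update on f(x) = 0.5 * ||A(x) - y||^2.
    # g = J^T r costs one VJP; alpha = ||g||^2 / ||J g||^2 is the exact
    # minimizer along -g for the locally linearized forward model, with
    # J g estimated by a normalized forward-difference probe (no JVP code).
    r = A_fun(x) - y                      # measurement residual
    g = vjp_fun(x, r)                     # gradient J^T r (one VJP)
    gnorm = np.linalg.norm(g)
    if gnorm == 0.0:
        return x                          # already stationary
    Jg = (A_fun(x + eps * g / gnorm) - A_fun(x)) / eps * gnorm
    alpha = (g @ g) / max(Jg @ Jg, 1e-30)  # analytic, model-optimal step
    return x - alpha * g

# Toy demo on a well-conditioned linear operator, where the VJP is A^T r.
rng = np.random.default_rng(0)
A = np.eye(5) + 0.1 * rng.standard_normal((5, 5))
y = A @ rng.standard_normal(5)
x = np.zeros(5)
for _ in range(200):
    x = analytic_step(lambda v: A @ v, lambda v, r: A.T @ r, x, y)
print(np.linalg.norm(A @ x - y))  # residual driven toward zero
```

For a linear operator this step is exactly optimal; for a nonlinear one it is optimal only for the local linearization, which is why the abstract pairs it with backtracking to guarantee descent.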

Minwoo Kim, Seunghyeok Shin, Hongki Lim • 2026

Related benchmarks

Task                    Dataset     Result         Rank
Gaussian Deblurring     FFHQ        PSNR 29.406    34
Gaussian Deblurring     ImageNet    SSIM 0.705     32
Super-Resolution (4x)   ImageNet    PSNR 26.367    30
Motion Deblurring       ImageNet    SSIM 0.799     27
Phase Retrieval         FFHQ        PSNR 29.253    26
Inpaint (box)           ImageNet    PSNR 21.381    26
HDR                     FFHQ        PSNR 26.275    25
Motion Deblurring       FFHQ        PSNR 31.736    22
HDR                     ImageNet    PSNR 24.522    21
Nonlinear Deblur        FFHQ        PSNR 28.746    20

(Showing 10 of 17 rows)
