FAST-DIPS: Adjoint-Free Analytic Steps and Hard-Constrained Likelihood Correction for Diffusion-Prior Inverse Problems
## About
Training-free diffusion priors enable inverse-problem solvers without retraining, but for nonlinear forward operators data consistency often relies on repeated derivatives or inner optimization/MCMC loops with conservative step sizes, incurring many iterations and denoiser/score evaluations. We propose a training-free solver that replaces these inner loops with a hard measurement-space feasibility constraint (a closed-form projection) and an analytic, model-optimal step size, enabling a small, fixed compute budget per noise level. Anchored at the denoiser prediction, the correction is approximated via an adjoint-free, ADMM-style splitting that combines the projection with a few steepest-descent updates, each using one vector-Jacobian product (VJP) and either one Jacobian-vector product (JVP) or a forward-difference probe, followed by backtracking and decoupled re-annealing. We prove local model optimality and descent under backtracking for the step-size rule, and derive an explicit KL bound for mode-substitution re-annealing under a local Gaussian conditional surrogate. We also develop a latent variant and a one-parameter pixel$\rightarrow$latent hybrid schedule. Experiments achieve competitive PSNR/SSIM/LPIPS with up to 19.5$\times$ speedup, without hand-coded adjoints or inner MCMC.
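To make the step-size rule concrete, the sketch below shows one steepest-descent update with the analytic, model-optimal step size on a toy nonlinear least-squares problem. This is a minimal illustration, not the paper's solver: the forward operator `forward` (elementwise squaring) and its VJP are hypothetical stand-ins, and the VJP is hand-written here only for self-containment; in the actual adjoint-free method it would come from reverse-mode autodiff. The direction is `d = J^T r` from one VJP, the curvature term `J d` comes from one forward-difference probe, and `alpha = ||d||^2 / ||J d||^2` exactly minimizes the local linearized residual model (note `r . Jd = ||J^T r||^2 = ||d||^2`), with backtracking as a safeguard.

```python
import numpy as np

def forward(x):
    # Toy nonlinear forward operator (hypothetical stand-in for, e.g., a
    # phase-retrieval-style measurement): elementwise squaring.
    return x ** 2

def vjp(x, v):
    # One vector-Jacobian product, J^T v with J = diag(2x). Hand-written only
    # for this toy; the solver itself would obtain it from reverse-mode
    # autodiff, so no hand-coded adjoint is required.
    return 2.0 * x * v

def analytic_step(x, y, eps=1e-6, shrink=0.5, max_backtracks=10):
    """One steepest-descent update with the model-optimal step size.

    Direction d = J^T r via one VJP; J d via one forward-difference probe;
    alpha = ||d||^2 / ||J d||^2 minimizes the linearized residual model
    ||r - alpha * J d||^2 over alpha. Backtracking enforces descent.
    """
    r = y - forward(x)                                  # measurement residual
    d = vjp(x, r)                                       # descent direction (one VJP)
    Jd = (forward(x + eps * d) - forward(x)) / eps      # forward-difference JVP probe
    alpha = float(d @ d) / max(float(Jd @ Jd), 1e-12)   # analytic step size
    f0 = float(r @ r)
    for _ in range(max_backtracks):                     # halve alpha until descent
        x_new = x + alpha * d
        r_new = y - forward(x_new)
        if float(r_new @ r_new) < f0:
            return x_new
        alpha *= shrink
    return x

rng = np.random.default_rng(0)
x_true = rng.uniform(0.9, 1.1, size=8)
y = forward(x_true)
x = np.ones(8)
for _ in range(50):
    x = analytic_step(x, y)
print("final residual norm:", np.linalg.norm(forward(x) - y))
```

Because the analytic step is exact for the linearized model, no step-size tuning is needed and backtracking rarely triggers near the solution; in the full method this update runs only a few times per noise level, after the closed-form measurement-space projection.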
## Related benchmarks
| Task | Dataset | Metric | Value | Rank |
|---|---|---|---|---|
| Gaussian Deblurring | FFHQ | PSNR | 29.406 | 34 |
| Gaussian Deblurring | ImageNet | SSIM | 0.705 | 32 |
| Super-Resolution (4x) | ImageNet | PSNR | 26.367 | 30 |
| Motion Deblurring | ImageNet | SSIM | 0.799 | 27 |
| Phase Retrieval | FFHQ | PSNR | 29.253 | 26 |
| Inpaint (box) | ImageNet | PSNR | 21.381 | 26 |
| HDR | FFHQ | PSNR | 26.275 | 25 |
| Motion Deblurring | FFHQ | PSNR | 31.736 | 22 |
| HDR | ImageNet | PSNR | 24.522 | 21 |
| Nonlinear Deblur | FFHQ | PSNR | 28.746 | 20 |