
Stochastic Generative Plug-and-Play Priors

About

Plug-and-play (PnP) methods are widely used for solving imaging inverse problems by incorporating a denoiser into optimization algorithms. Score-based diffusion models (SBDMs) have recently demonstrated strong generative performance through a denoiser trained across a wide range of noise levels. Despite their shared reliance on denoisers, it remains unclear how to systematically use SBDMs as priors within the PnP framework without relying on reverse diffusion sampling. In this paper, we establish a score-based interpretation of PnP that justifies using pretrained SBDMs directly within PnP algorithms. Building on this connection, we introduce a stochastic generative PnP (SGPnP) framework that injects noise to better leverage expressive generative SBDM priors, thereby improving robustness in severely ill-posed inverse problems. We provide a new theory showing that this noise injection induces optimization over a Gaussian-smoothed objective and promotes escape from strict saddle points. Experiments on challenging inverse problems, such as multi-coil MRI reconstruction and large-mask natural image inpainting, demonstrate consistent improvements over conventional PnP methods and performance competitive with diffusion-based solvers.
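The abstract describes PnP iterations that alternate a data-consistency step with a denoising (prior) step, with SGPnP additionally injecting Gaussian noise before denoising. A minimal sketch of such an iteration is below; the gradient-step splitting, the step size `gamma`, the noise level `sigma`, and the stand-in `denoise` function are illustrative assumptions, not the paper's exact algorithm or pretrained SBDM denoiser.

```python
import numpy as np

def denoise(x, sigma):
    # Placeholder for a pretrained denoiser D_sigma(x). In SGPnP this would
    # be an SBDM denoiser evaluated at noise level sigma; a no-op stand-in
    # keeps the sketch runnable.
    return x

def sgpnp(y, A, At, sigma=0.1, gamma=0.5, n_iters=100, rng=None):
    """Sketch of one stochastic generative PnP loop: a gradient step on the
    data-fidelity term 0.5 * ||A x - y||^2, Gaussian noise injection, then a
    denoising (prior) step."""
    rng = np.random.default_rng(rng)
    x = At(y)  # initialize from the adjoint (back-projection) of the data
    for _ in range(n_iters):
        grad = At(A(x) - y)                            # data-fidelity gradient
        z = x - gamma * grad                           # data-consistency step
        z = z + sigma * rng.standard_normal(z.shape)   # stochastic noise injection
        x = denoise(z, sigma)                          # plug-and-play prior step
    return x
```

With `sigma = 0` and an identity denoiser this reduces to plain gradient descent on the data-fidelity term, which is one way to see that the noise injection, as the abstract notes, amounts to optimizing a Gaussian-smoothed version of the objective.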

Chicago Y. Park, Edward P. Chandler, Yuyang Hu, Michael T. McCann, Cristina Garcia-Cardona, Brendt Wohlberg, Ulugbek S. Kamilov • 2026

Related benchmarks

Task               Dataset          Metric   Result   Rank
Image Inpainting   FFHQ (test)      LPIPS    0.108    54
Super-Resolution   FFHQ (test)      LPIPS    0.194    18
Accelerated MRI    fastMRI (test)   PSNR     32.54    14
Deblurring         FFHQ (test)      PSNR     34.52    6
