
Diffusion Posterior Sampling for General Noisy Inverse Problems

About

Diffusion models have recently been studied as powerful generative inverse problem solvers, owing to their high-quality reconstructions and the ease with which they can be combined with existing iterative solvers. However, most works focus on solving simple linear inverse problems in noiseless settings, which significantly under-represents the complexity of real-world problems. In this work, we extend diffusion solvers to efficiently handle general noisy (non)linear inverse problems via approximation of the posterior sampling. Interestingly, the resulting posterior sampling scheme is a blended version of diffusion sampling with the manifold constrained gradient, without a strict measurement consistency projection step, yielding a more desirable generative path in noisy settings compared to previous studies. Our method demonstrates that diffusion models can incorporate various measurement noise statistics such as Gaussian and Poisson, and can also efficiently handle noisy nonlinear inverse problems such as Fourier phase retrieval and non-uniform deblurring. Code is available at https://github.com/DPS2022/diffusion-posterior-sampling
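The sampling scheme described above alternates an unconditional diffusion step with a gradient of the measurement residual taken through the denoised estimate, rather than a hard consistency projection. The following is a minimal NumPy sketch of one such step, not the paper's implementation: the epsilon network, the noise schedule, and the masking operator are toy placeholders, the forward operator is assumed self-adjoint, and the denoiser Jacobian is approximated by the identity.

```python
import numpy as np

def dps_step(x_t, t, eps_model, A, y, alpha_bar, zeta=0.1):
    """One simplified diffusion-posterior-sampling-style step (sketch).

    Assumptions (not from the paper's code): `A` is self-adjoint
    (e.g. a diagonal mask, so A^T = A), the denoiser Jacobian is
    approximated by the identity, and the ancestral step is taken
    in a deterministic DDIM-like form.
    """
    a_t, a_prev = alpha_bar[t], alpha_bar[t - 1]

    # Tweedie-style estimate of the clean image x0 from the noisy x_t.
    eps = eps_model(x_t, t)
    x0_hat = (x_t - np.sqrt(1.0 - a_t) * eps) / np.sqrt(a_t)

    # Unconditional diffusion step (deterministic, DDIM-like).
    x_prev = np.sqrt(a_prev) * x0_hat + np.sqrt(1.0 - a_prev) * eps

    # Gradient of ||y - A(x0_hat)||^2 w.r.t. x_t, via the chain rule
    # through x0_hat (Jacobian ~ identity, A^T = A by assumption).
    residual = y - A(x0_hat)
    grad = -2.0 * A(residual) / np.sqrt(a_t)

    # Blend: unconditional step minus a scaled measurement gradient,
    # with no hard projection onto the measurement set.
    return x_prev - zeta * grad

# Toy usage: random-mask inpainting with a placeholder epsilon network.
rng = np.random.default_rng(0)
mask = (rng.random((8, 8)) > 0.5).astype(float)
A = lambda x: mask * x                       # self-adjoint masking operator
x_true = rng.normal(size=(8, 8))
y = A(x_true)                                # noiseless masked measurement
alpha_bar = np.linspace(0.9999, 0.05, 100)   # toy schedule, alpha_bar[0] ~ 1
eps_model = lambda x, t: np.zeros_like(x)    # placeholder denoiser
x_next = dps_step(rng.normal(size=(8, 8)), t=99,
                  eps_model=eps_model, A=A, y=y,
                  alpha_bar=alpha_bar, zeta=0.1)
```

Because the correction is a soft gradient step rather than a projection, measurement noise is absorbed gradually over the trajectory instead of being imprinted at every iteration.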

Hyungjin Chung, Jeongsol Kim, Michael T. McCann, Marc L. Klasky, Jong Chul Ye • 2022

Related benchmarks

Task | Dataset | Metric | Result | Rank
Class-conditional Image Generation | ImageNet | FID | 193 | 158
Image Deblurring | CBSD68 (val) | PSNR | 21.94 | 140
Super-Resolution | DIV2K (val) | PSNR | 23.05 | 91
Conditional Image Generation | CIFAR-10 | FID | 172 | 77
Image Reconstruction | FFHQ (val) | PSNR | 28.33 | 66
Image Inpainting | FFHQ (test) | LPIPS | 0.176 | 54
Zero-Shot Posterior Sampling | FFHQ 256x256 (val) | PSNR | 20.34 | 40
Zero-Shot Posterior Sampling | ImageNet 256x256 (val) | PSNR | 13.72 | 40
Inpainting | CelebA | PSNR | 36.02 | 38
Text-to-Image Generation | Short-DrawBench 1k prompts, Stable Diffusion v1.5 base (test) | R1 Score | 0.34 | 35

Showing 10 of 340 rows
