Diffusion Prior-Based Amortized Variational Inference for Noisy Inverse Problems

About

Recent studies on inverse problems have proposed posterior samplers that leverage pre-trained diffusion models as powerful priors. These attempts have paved the way for using diffusion models in a wide range of inverse problems. However, existing methods entail computationally demanding iterative sampling procedures and optimize a separate solution for each measurement, which limits scalability and generalization across unseen samples. To address these limitations, we propose a novel approach, Diffusion prior-based Amortized Variational Inference (DAVI), which solves inverse problems with a diffusion prior from an amortized variational inference perspective. Specifically, instead of separate measurement-wise optimization, our amortized inference learns a function that directly maps measurements to the implicit posterior distributions of the corresponding clean data, enabling single-step posterior sampling even for unseen measurements. Extensive experiments on image restoration tasks, e.g., Gaussian deblur, 4$\times$ super-resolution, and box inpainting, with two benchmark datasets demonstrate our approach's superior performance over strong baselines. Code is available at https://github.com/mlvlab/DAVI.
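To make the amortized-inference idea concrete, below is a minimal, hypothetical PyTorch sketch of single-step posterior sampling: a conditional network maps a measurement (plus a noise vector, so the mapping is stochastic and represents an implicit posterior) directly to a candidate clean sample in one forward pass, with no per-measurement iterative optimization. The network architecture, names, and tensor shapes are illustrative assumptions rather than the paper's implementation, and the diffusion-prior training objective is omitted.

```python
# Illustrative sketch of amortized single-step posterior sampling
# (hypothetical network and shapes; not the authors' implementation).
import torch
import torch.nn as nn

class AmortizedPosterior(nn.Module):
    """Maps a measurement y (and noise z) to a sample x ~ q_phi(x | y)."""
    def __init__(self, dim: int = 64, hidden: int = 256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(2 * dim, hidden), nn.SiLU(),
            nn.Linear(hidden, hidden), nn.SiLU(),
            nn.Linear(hidden, dim),
        )

    def forward(self, y: torch.Tensor, z: torch.Tensor) -> torch.Tensor:
        # Concatenating a noise vector makes the mapping stochastic, so the
        # network defines an implicit posterior rather than a point estimate.
        return self.net(torch.cat([y, z], dim=-1))

# Single-step sampling for an unseen measurement: one forward pass,
# no per-measurement iterative optimization.
model = AmortizedPosterior(dim=64)
y = torch.randn(8, 64)   # batch of (flattened) measurements
z = torch.randn(8, 64)   # latent noise for stochastic sampling
x_hat = model(y, z)      # posterior samples of the clean data
print(x_hat.shape)       # torch.Size([8, 64])
```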

Sojin Lee, Dogyun Park, Inho Kong, Hyunwoo J. Kim • 2024

Related benchmarks

| Task | Dataset | Result | Rank |
| --- | --- | --- | --- |
| 4x super-resolution | FFHQ 256x256 | PSNR 28.23 | 25 |
| Gaussian deblur | FFHQ 256x256 | PSNR 25.46 | 20 |
| 4x super-resolution | ImageNet 256x256 (test) | PSNR 26.58 | 14 |
| 4x super-resolution | FFHQ 256x256 (test) | PSNR 28.23 | 9 |
| 4× Super-Resolution | FFHQ (val) | PSNR 28.23 | 8 |
| Gaussian Deblurring | FFHQ (val) | PSNR 25.46 | 8 |
| Deblurring | CelebA-HQ | FID 26.43 | 8 |
| Box Inpainting | FFHQ 256x256 (centered 128x128 mask) | PSNR 26.25 | 7 |
| Box Inpainting | ImageNet 256x256 (test) | PSNR 21.96 | 7 |
| Denoising | FFHQ 256x256 (test) | PSNR 31.72 | 7 |
Showing 10 of 18 rows

Other info

Code: https://github.com/mlvlab/DAVI
