Jacobian-aware Posterior Sampling for Inverse Problems

About

Diffusion models provide powerful generative priors for solving inverse problems by sampling from a posterior distribution conditioned on corrupted measurements. Existing methods primarily follow two paradigms: direct methods, which approximate the likelihood term, and proximal methods, which incorporate intermediate solutions satisfying measurement constraints into the sampling process. We demonstrate that these approaches differ fundamentally in their treatment of the diffusion denoiser's Jacobian within the likelihood term. While this Jacobian encodes critical prior knowledge of the data distribution, training-induced non-idealities can degrade performance in zero-shot settings. In this work, we bridge direct and proximal approaches by proposing a principled Jacobian-Aware Posterior Sampler (JAPS). JAPS leverages the Jacobian's prior knowledge while mitigating its detrimental effects through a corresponding proximal solution, requiring no additional computational cost. Our method enhances reconstruction quality across diverse linear and nonlinear noisy imaging tasks, outperforming existing diffusion-based baselines in perceptual quality while maintaining or improving distortion metrics.
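The distinction the abstract draws can be made concrete with a toy example. Below is an illustrative NumPy sketch (not the authors' JAPS algorithm): for a linear measurement operator A and a toy *linear* "denoiser" D(x) = Wx, the Jacobian-aware likelihood gradient of 0.5·||y − A·D(x)||² carries a factor of the denoiser Jacobian Jᵀ = Wᵀ, whereas a common Jacobian-free approximation drops it (replacing J with the identity). All names and the linear-denoiser assumption here are for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)
d, m = 8, 4

A = rng.standard_normal((m, d))                     # linear measurement operator (assumed)
W = np.eye(d) + 0.1 * rng.standard_normal((d, d))   # toy linear "denoiser" D(x) = W x
x = rng.standard_normal(d)
y = A @ (W @ x) + 0.01 * rng.standard_normal(m)     # noisy measurements

def direct_grad(x):
    """Jacobian-aware gradient of 0.5*||y - A D(x)||^2: J^T A^T (A D(x) - y)."""
    r = A @ (W @ x) - y
    return W.T @ (A.T @ r)          # J = W exactly, since the toy denoiser is linear

def jacobian_free_grad(x):
    """Common approximation: replace the denoiser Jacobian J with the identity."""
    r = A @ (W @ x) - y
    return A.T @ r

# Finite-difference check that direct_grad uses the true Jacobian.
eps = 1e-6
g_num = np.array([
    (0.5 * np.sum((A @ (W @ (x + eps * e)) - y) ** 2)
     - 0.5 * np.sum((A @ (W @ (x - eps * e)) - y) ** 2)) / (2 * eps)
    for e in np.eye(d)
])
print(np.allclose(direct_grad(x), g_num, atol=1e-5))  # True
```

In real diffusion samplers the denoiser is a deep network, so the Jacobian-vector product is obtained by backpropagation rather than an explicit matrix; the point of the sketch is only that the two guidance variants differ exactly by the Jacobian factor the abstract discusses.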

Liav Hen, Tom Tirer, Raja Giryes, Shady Abu-Hussein • 2025

Related benchmarks

Task                 Dataset                    PSNR    Rank
Superresolution      CelebA-HQ (test)           28.39   32
Box Inpainting       ImageNet 256x256 (test)    19.44   14
Inpainting (Box)     CelebA-HQ (test)           27.58   13
Deblurring           ImageNet 256               23.64   11
Nonlinear Blur       CelebA-HQ                  26.4    8
Gaussian Blur        ImageNet-256 (test)        27.16   7
Motion Blur          CelebA-HQ (test)           28.7    7
Motion Blur          ImageNet-256 (test)        25.31   7
Random Inpainting    ImageNet-256 (test)        23.71   7
Super-Resolution     ImageNet-256 (test)        24.55   7

Showing 10 of 13 rows.
