An Expectation-Maximization Algorithm for Training Clean Diffusion Models from Corrupted Observations

About

Diffusion models excel in solving imaging inverse problems due to their ability to model complex image priors. However, their reliance on large, clean datasets for training limits their practical use where clean data is scarce. In this paper, we propose EMDiffusion, an expectation-maximization (EM) approach to train diffusion models from corrupted observations. Our method alternates between reconstructing clean images from corrupted data using a known diffusion model (E-step) and refining diffusion model weights based on these reconstructions (M-step). This iterative process leads the learned diffusion model to gradually converge to the true clean data distribution. We validate our method through extensive experiments on diverse computational imaging tasks, including random inpainting, denoising, and deblurring, achieving new state-of-the-art performance.
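The alternation described above can be sketched in miniature. The toy example below is only an illustration of the EM structure, not the paper's implementation: a 1-D Gaussian prior stands in for the diffusion model, and a closed-form conjugate posterior update stands in for diffusion posterior sampling. All names and parameter values are illustrative assumptions.

```python
import numpy as np

# Toy EM loop: observations y = x + eps with known noise sigma_obs;
# the "model" is a Gaussian prior N(mu, tau2) over clean values x,
# standing in for the diffusion prior over clean images.
rng = np.random.default_rng(0)
sigma_obs = 1.0                                  # known corruption noise level
x_true = rng.normal(3.0, 0.5, 5000)              # hidden clean data (never seen)
y = x_true + rng.normal(0.0, sigma_obs, x_true.shape)  # corrupted observations

mu, tau2 = 0.0, 10.0  # deliberately poor initial prior
for _ in range(200):
    # E-step: posterior over clean x given y under the current prior
    # (conjugate Gaussian update; plays the role of posterior sampling
    # with the current diffusion model)
    post_var = 1.0 / (1.0 / tau2 + 1.0 / sigma_obs**2)
    post_mean = post_var * (mu / tau2 + y / sigma_obs**2)
    # M-step: refit the prior to the reconstructions
    # (plays the role of retraining the diffusion model weights)
    mu = post_mean.mean()
    tau2 = post_mean.var() + post_var  # second moment includes posterior variance

# mu and tau2 drift toward the true clean-data statistics (~3.0 and ~0.25),
# mirroring how the learned diffusion model converges to the clean distribution.
```

In the paper's setting the E-step reconstructs full images via diffusion posterior sampling and the M-step takes gradient steps on the diffusion training loss over those reconstructions; the toy keeps only the alternating structure.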

Weimin Bai, Yifei Wang, Wenzheng Chen, He Sun • 2024

Related benchmarks

| Task | Dataset | Result | Rank |
| --- | --- | --- | --- |
| Image Generation | CIFAR-10 32x32 | FID 86.47 | 147 |
| Image Distribution Recovery | CIFAR-10 (test) | FID 21.08 | 15 |
| Image Deblurring | CelebA (test) | PSNR 23.74 | 14 |
| Image Denoising | CIFAR-10 (test) | PSNR 23.16 | 13 |
| Denoising | CIFAR-10 32x32 | FID 86.47 | 13 |
| Image Distribution Recovery | CelebA official tool with DDIM (test) | FID 59.04 | 8 |
| Image Inpainting | CIFAR-10 (test) | PSNR 24.7 | 5 |
| Gaussian Deblurring | CelebA-HQ σ=0.2 (noisy) | FID 51.33 | 4 |
| Gaussian Deblurring | CelebA-HQ noiseless σ=0.0 | FID 56.69 | 4 |
| Random Inpainting (p=0.9) | CelebA-HQ σ=0.2 (noisy) | FID 165.6 | 4 |
Showing 10 of 13 rows
