
Diffusion Bridge Implicit Models

About

Denoising diffusion bridge models (DDBMs) are a powerful variant of diffusion models for interpolating between two arbitrary paired distributions given as endpoints. Despite their promising performance in tasks like image translation, DDBMs require a computationally intensive sampling process that involves simulating a (stochastic) differential equation through hundreds of network evaluations. In this work, we take the first step toward fast sampling of DDBMs without extra training, motivated by well-established recipes in diffusion models. We generalize DDBMs via a class of non-Markovian diffusion bridges defined on the discretized sampling timesteps, which share the same marginal distributions and training objectives, give rise to generative processes ranging from stochastic to deterministic, and result in diffusion bridge implicit models (DBIMs). DBIMs are not only up to 25$\times$ faster than the vanilla sampler of DDBMs but also induce a novel, simple, and insightful form of ordinary differential equation (ODE) which inspires high-order numerical solvers. Moreover, DBIMs maintain generation diversity in a distinguished way, using a booting noise in the initial sampling step, which enables faithful encoding, reconstruction, and semantic interpolation in image translation tasks. Code is available at https://github.com/thu-ml/DiffusionBridge.
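The "well-established recipes" the abstract refers to include the DDIM sampler, whose implicit (non-Markovian) formulation DBIMs extend to diffusion bridges. As background, here is a minimal sketch of a single DDIM update step: with `eta = 0` the step is fully deterministic, and `eta = 1` recovers a stochastic DDPM-like step. This is the standard DDIM formula, not the paper's bridge-specific update; the function name and toy setup are illustrative.

```python
import numpy as np

def ddim_step(x_t, eps_pred, a_bar_t, a_bar_prev, eta=0.0, rng=None):
    """One DDIM update x_t -> x_{t-1}.

    a_bar_t, a_bar_prev: cumulative alpha products at the current and
    previous timesteps. eta interpolates between a deterministic (0)
    and a stochastic (1) generative process.
    """
    # Predicted clean sample implied by the noise prediction.
    x0_pred = (x_t - np.sqrt(1.0 - a_bar_t) * eps_pred) / np.sqrt(a_bar_t)
    # Injected-noise scale, controlled by eta.
    sigma = eta * np.sqrt((1.0 - a_bar_prev) / (1.0 - a_bar_t)) \
                * np.sqrt(1.0 - a_bar_t / a_bar_prev)
    # Component pointing back in the direction of x_t.
    dir_xt = np.sqrt(np.clip(1.0 - a_bar_prev - sigma**2, 0.0, None)) * eps_pred
    noise = 0.0 if eta == 0.0 else \
        (rng or np.random.default_rng()).standard_normal(x_t.shape)
    return np.sqrt(a_bar_prev) * x0_pred + dir_xt + sigma * noise
```

Because the `eta = 0` update involves no sampled noise, the whole trajectory is a deterministic map from the initial latent, which is what allows implicit models to take large steps and to support encoding and reconstruction; DBIMs carry the same idea over to bridge processes with fixed endpoints.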

Kaiwen Zheng, Guande He, Jianfei Chen, Fan Bao, Jun Zhu • 2024

Related benchmarks

| Task | Dataset | Result | Rank |
| --- | --- | --- | --- |
| T1 to T2 MRI translation | IXI (test) | PSNR 20.43 | 14 |
| Semantic Translation | PSCDE | Dice 16.7 | 6 |
| Cross-modal Image Translation | Sentinel SAR → Optical | SSIM 0.14 | 6 |
| Cross-modal Image Translation | IXI T2 → T1 | SSIM 0.33 | 6 |
