
Iterative Importance Fine-tuning of Diffusion Models

About

Diffusion models are an important tool for generative modelling, serving as effective priors in applications such as imaging and protein design. A key challenge in applying diffusion models for downstream tasks is efficiently sampling from resulting posterior distributions, which can be addressed using Doob's $h$-transform. This work introduces a self-supervised algorithm for fine-tuning diffusion models by learning the optimal control, enabling amortised conditional sampling. Our method iteratively refines the control using a synthetic dataset resampled with path-based importance weights. We demonstrate the effectiveness of this framework on class-conditional sampling, inverse problems and reward fine-tuning for text-to-image diffusion models.
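The abstract's iterative scheme can be illustrated with a minimal toy sketch: simulate controlled paths, weight them by a path-based importance weight (here, simply the exponentiated terminal reward), resample a synthetic dataset with those weights, and refit the control. Everything below is a hypothetical one-dimensional stand-in, not the paper's implementation: the scalar `control` replaces a learned control network, and `reward` is an arbitrary toy objective.

```python
import numpy as np

rng = np.random.default_rng(0)
N_STEPS, DT = 20, 0.05  # horizon T = N_STEPS * DT = 1.0

def sample_paths(control, n_paths=256):
    """Simulate x_{t+1} = x_t + control*dt + sqrt(dt)*noise; shape (n_paths, N_STEPS+1)."""
    x = rng.normal(size=n_paths)
    path = [x]
    for _ in range(N_STEPS):
        x = x + control * DT + np.sqrt(DT) * rng.normal(size=n_paths)
        path.append(x)
    return np.stack(path, axis=1)

def importance_weights(paths, reward):
    """Normalised weights proportional to exp(reward of terminal state)."""
    logw = reward(paths[:, -1])
    logw -= logw.max()          # stabilise the exponential
    w = np.exp(logw)
    return w / w.sum()

reward = lambda x: -(x - 2.0) ** 2   # toy reward: prefer terminal states near 2.0

control = 0.0
for _ in range(10):
    paths = sample_paths(control)
    w = importance_weights(paths, reward)
    # resample a synthetic dataset according to the importance weights
    idx = rng.choice(len(paths), size=len(paths), p=w)
    resampled = paths[idx]
    # "fine-tune": move the control toward the drift implied by the resampled paths
    target_drift = (resampled[:, -1].mean() - resampled[:, 0].mean()) / (N_STEPS * DT)
    control = 0.5 * control + 0.5 * target_drift
```

After a few iterations the drift settles near the reward's preferred region, mimicking how repeated resample-and-refit steps amortise conditional sampling into the control.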

Alexander Denker, Shreyas Padhy, Francisco Vargas, Johannes Hertrich • 2025

Related benchmarks

Task                      | Dataset        | Metric                  | Result | Rank
Compositional generation  | MNIST digit 9  | Accuracy                | 98.44  | 6
Compositional generation  | MNIST digit 0  | Accuracy                | 98.54  | 6
Compositional generation  | MNIST digit 3  | Classification Accuracy | 97.17  | 6
