
Posterior Mean Matching: Generative Modeling through Online Bayesian Inference

About

This paper introduces posterior mean matching (PMM), a new approach to generative modeling grounded in Bayesian inference. PMM uses conjugate pairs of distributions to model complex data across modalities such as images and text, offering a flexible alternative to existing methods like diffusion models. A PMM model iteratively refines noisy approximations of the target distribution using updates from online Bayesian inference; because these mechanics rest on general Bayesian models, the framework is flexible. We demonstrate this flexibility with three specializations: a PMM model of real-valued data built on the Normal-Normal model, a PMM model of count data built on the Gamma-Poisson model, and a PMM model of discrete data built on the Dirichlet-Categorical model. For the Normal-Normal PMM, we establish a direct connection to diffusion models by showing that its continuous-time formulation converges to a stochastic differential equation (SDE). For the Gamma-Poisson PMM, we derive a novel SDE driven by a Cox process, a significant departure from traditional Brownian-motion-based generative models. Empirically, PMMs are competitive with existing generative models on language modeling and image generation.
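To make the core mechanism concrete, here is a minimal sketch of an online conjugate update in the Normal-Normal model: each noisy observation updates the Gaussian posterior in closed form, and the sequence of posterior means drifts from the prior toward the target. This is only an illustration of online Bayesian inference with a conjugate pair, not the paper's PMM training algorithm; the function name and parameters are ours.

```python
import random

def normal_normal_posterior_means(x_target, noise_std=1.0, prior_mean=0.0,
                                  prior_var=10.0, n_steps=200, seed=0):
    """Online Normal-Normal conjugate updates.

    After each noisy observation y_t ~ N(x_target, noise_std^2), the
    posterior over x_target remains Gaussian, so its mean and variance
    can be updated in closed form. Returns the trajectory of posterior
    means, which moves from the prior mean toward x_target.
    """
    rng = random.Random(seed)
    mean, var = prior_mean, prior_var
    means = [mean]
    for _ in range(n_steps):
        y = x_target + rng.gauss(0.0, noise_std)
        # Conjugate update: precision-weighted combination of the current
        # posterior and the new observation's likelihood.
        post_var = 1.0 / (1.0 / var + 1.0 / noise_std**2)
        mean = post_var * (mean / var + y / noise_std**2)
        var = post_var
        means.append(mean)
    return means

means = normal_normal_posterior_means(x_target=3.0)
print(means[0], means[-1])  # starts at the prior mean, ends near the target
```

In PMM, analogous closed-form posterior-mean trajectories (Normal-Normal, Gamma-Poisson, Dirichlet-Categorical) are what the generative model learns to match.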

Sebastian Salazar, Michal Kucer, Yixin Wang, Emily Casleton, David Blei • 2024

Related benchmarks

Task                Dataset                   Result             Rank
Image Generation    FFHQ 64x64 (test)         FID: 3.41          69
Language Modeling   OpenWebText               Perplexity: 42.58  50
Language Modeling   text8                     BPC: 1.29          23
Image Generation    CIFAR-10 2009 (test)      FID: 2.18          13
Image Generation    AFHQ 64 v2 2020 (test)    FID: 2.48          10

Other info

Code
