
Hierarchical VAE with a Diffusion-based VampPrior

About

Deep hierarchical variational autoencoders (VAEs) are powerful latent variable generative models. In this paper, we introduce a hierarchical VAE with a diffusion-based Variational Mixture of Posteriors Prior (VampPrior). We apply amortization to scale the VampPrior to models with many stochastic layers. The proposed approach achieves better performance than the original VampPrior work and other deep hierarchical VAEs while using fewer parameters. We empirically validate our method on standard benchmark datasets (MNIST, OMNIGLOT, CIFAR10) and demonstrate improved training stability and latent space utilization.
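To make the core VampPrior idea concrete, here is a minimal numerical sketch (not the paper's hierarchical, diffusion-based model): the prior is a uniform mixture of the encoder's variational posteriors evaluated at K learnable pseudo-inputs u_k, i.e. p(z) = (1/K) Σ_k q(z | u_k). The `toy_encoder` below is a hypothetical stand-in for a trained encoder network.

```python
import numpy as np

def toy_encoder(u):
    """Hypothetical encoder: maps an input to (mean, log-variance) of a
    diagonal-Gaussian variational posterior over the latent z."""
    mu = np.tanh(u)
    logvar = -np.abs(u)  # arbitrary toy choice of posterior variance
    return mu, logvar

def log_gaussian(z, mu, logvar):
    """Log-density of a diagonal Gaussian evaluated at z."""
    return -0.5 * np.sum(np.log(2 * np.pi) + logvar
                         + (z - mu) ** 2 / np.exp(logvar))

def vamp_prior_logp(z, pseudo_inputs):
    """log p(z) = log (1/K) sum_k q(z | u_k), via log-sum-exp for stability.
    In the VampPrior the pseudo-inputs u_k are learned jointly with the VAE."""
    logs = np.array([log_gaussian(z, *toy_encoder(u)) for u in pseudo_inputs])
    m = logs.max()
    return m + np.log(np.mean(np.exp(logs - m)))

# K = 3 pseudo-inputs (would be trainable parameters in practice)
pseudo_inputs = np.array([[0.0, 0.0], [1.0, -1.0], [-2.0, 2.0]])
z = np.array([0.1, -0.2])
print(vamp_prior_logp(z, pseudo_inputs))
```

Because the mixture components are the encoder's own posteriors, the prior adapts to where the aggregate posterior actually places mass; the paper's contribution is scaling this idea to many stochastic layers via amortization and a diffusion-based formulation.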

Anna Kuzina, Jakub M. Tomczak • 2024

Related benchmarks

Task                  Dataset           Result  Rank
Generative Modeling   MNIST (test)      -       35
Generative Modeling   Omniglot (test)   -       8

Other info

Code
