# The VampPrior Mixture Model

## About
Widely used deep latent variable models (DLVMs), in particular Variational Autoencoders (VAEs), employ overly simplistic priors on the latent space. To achieve strong clustering performance, existing methods that replace the standard normal prior with a Gaussian mixture model (GMM) require the number of clusters to be set close to the expected number of ground-truth classes a priori, and they are susceptible to poor initializations. We leverage VampPrior concepts (Tomczak and Welling, 2018) to fit a Bayesian GMM prior, resulting in the VampPrior Mixture Model (VMM), a novel prior for DLVMs. In a VAE, the VMM attains highly competitive clustering performance on benchmark datasets. Integrating the VMM into scVI (Lopez et al., 2018), a popular scRNA-seq integration method, significantly improves its performance and automatically arranges cells into clusters with similar biological characteristics.
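To make the idea concrete, here is a minimal NumPy sketch of a VampPrior-style mixture prior: the prior over the latent code `z` is a mixture of the encoder's posteriors evaluated at learned pseudo-inputs. The encoder stub, the parameter names, and the shapes are illustrative assumptions, not the paper's implementation; in the VMM the mixture weights are further given a Bayesian treatment rather than fixed.

```python
import numpy as np

def encoder(x, W):
    """Stub encoder (assumption: a linear map) returning Gaussian mean / log-variance."""
    h = x @ W
    d = h.shape[-1] // 2
    return h[..., :d], h[..., d:]

def vampprior_log_prob(z, pseudo_inputs, W, log_pi):
    """log p(z) under a mixture of encoder posteriors at the pseudo-inputs."""
    mu, log_var = encoder(pseudo_inputs, W)   # (K, d) each
    var = np.exp(log_var)
    diff = z[None, :] - mu                    # (K, d)
    # log N(z; mu_k, var_k) per component k, diagonal covariance
    log_comp = -0.5 * np.sum(np.log(2 * np.pi * var) + diff**2 / var, axis=-1)
    # mixture: log sum_k pi_k N(z; mu_k, var_k)
    return float(np.logaddexp.reduce(log_pi + log_comp))

rng = np.random.default_rng(0)
K, x_dim, d = 5, 8, 2
W = rng.normal(scale=0.1, size=(x_dim, 2 * d))
pseudo_inputs = rng.normal(size=(K, x_dim))   # learned jointly with the VAE in practice
log_pi = np.full(K, -np.log(K))               # uniform weights, as in the original VampPrior
z = rng.normal(size=d)
lp = vampprior_log_prob(z, pseudo_inputs, W, log_pi)
```

Because the components come from the encoder itself, the prior adapts to the aggregate posterior during training; the VMM builds on this by fitting the mixture as a Bayesian GMM so the number of utilized clusters is inferred rather than fixed in advance.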
## Related benchmarks
| Task | Dataset | Metric | Value | Rank |
|---|---|---|---|---|
| Clustering | Fashion MNIST | NMI | 68.8 | 95 |
| Clustering | MNIST (train+val) | Utilized Clusters | 13.9 | 8 |
| scRNA-seq integration | cortex | Bio Conservation | 0.76 | 7 |
| scRNA-seq integration | PBMC | Batch Correction | 88.6 | 7 |
| scRNA-seq integration | split-seq | Batch Correction | 87.8 | 7 |
| scRNA-seq integration | lung atlas | Batch Correction | 0.616 | 7 |
| Clustering | MNIST | ACC | 0.96 | 5 |
| Clustering | Fashion MNIST (train+val) | Utilized Clusters | 16.5 | 4 |
| scRNA-seq clustering | cortex | Utilized Clusters | 100 | 3 |
| scRNA-seq clustering | PBMC | Clusters Used | 100 | 3 |