# Categorical Reparameterization with Gumbel-Softmax

## About
Categorical variables are a natural choice for representing discrete structure in the world. However, stochastic neural networks rarely use categorical latent variables due to the inability to backpropagate through samples. In this work, we present an efficient gradient estimator that replaces the non-differentiable sample from a categorical distribution with a differentiable sample from a novel Gumbel-Softmax distribution. This distribution has the essential property that it can be smoothly annealed into a categorical distribution. We show that our Gumbel-Softmax estimator outperforms state-of-the-art gradient estimators on structured output prediction and unsupervised generative modeling tasks with categorical latent variables, and enables large speedups on semi-supervised classification.
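The trick described above can be sketched in a few lines: add Gumbel(0, 1) noise to the class log-probabilities, then apply a temperature-scaled softmax in place of the non-differentiable argmax. A minimal NumPy sketch (function and variable names are illustrative, not from the paper's code):

```python
import numpy as np

def sample_gumbel_softmax(logits, tau, rng):
    """Draw one sample from a Gumbel-Softmax distribution.

    Gumbel(0, 1) noise g_i = -log(-log(u_i)) with u_i ~ Uniform(0, 1) is
    added to the logits; the softmax with temperature tau relaxes the
    argmax that would yield an exact categorical sample.
    """
    u = rng.uniform(size=logits.shape)
    g = -np.log(-np.log(u))            # Gumbel(0, 1) noise
    y = (logits + g) / tau
    y = np.exp(y - y.max())            # numerically stable softmax
    return y / y.sum()

rng = np.random.default_rng(0)
logits = np.log(np.array([0.1, 0.6, 0.3]))
sample = sample_gumbel_softmax(logits, tau=0.5, rng=rng)
# The sample lies on the probability simplex; as tau -> 0 it approaches
# a one-hot categorical sample, and as tau grows it approaches uniform.
```

Because the sample is a deterministic, differentiable function of the logits given the noise, gradients can flow through it during backpropagation, which is exactly what a hard categorical sample prevents.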
## Related benchmarks
| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Image Generation | CelebA-HQ | FID | 5.47 | 79 |
| Image Generation | FFHQ (test) | FID | 7.97 | 77 |
| Image Generation | LSUN Bedroom v1 (test) | FID | 23 | 56 |
| Image Generation | AFHQ v1 (test) | FID | 14.4 | 56 |
| Image Generation | LSUN Church v1 (test) | FID | 13.7 | 55 |
| Variational Inference | MNIST (test) | Negative ELBO | 101.4 | 52 |
| Generative Modeling | MNIST (train) | ELBO | 127.6 | 51 |
| Image Generation | Fashion MNIST | -- | -- | 38 |
| Image Reconstruction | MNIST | MSE | 0.0222 | 34 |
| Speech Decompression | VCTK (test) | Log Spectral Distance | 1.13 | 28 |