Maximum Entropy Generators for Energy-Based Models
About
Maximum likelihood estimation of energy-based models is a challenging problem due to the intractability of the log-likelihood gradient. In this work, we propose learning both the energy function and an amortized approximate sampling mechanism using a neural generator network, which provides an efficient approximation of the log-likelihood gradient. The resulting objective requires maximizing the entropy of the generated samples, which we perform using recently proposed nonparametric mutual information estimators. Finally, to stabilize the resulting adversarial game, we use a zero-centered gradient penalty derived as a necessary condition from the score matching literature. The proposed technique generates sharp images with Inception and FID scores competitive with recent GAN techniques, does not suffer from mode collapse, and is competitive with state-of-the-art anomaly detection techniques.
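To illustrate the stabilizing term mentioned above, here is a minimal sketch of a zero-centered gradient penalty, ||∇ₓE(x)||², evaluated at data points. The toy quadratic energy and its analytic gradient are assumptions made purely for illustration; the paper's energy function is a neural network whose gradient would come from autodiff.

```python
import numpy as np

def energy(x):
    """Toy energy E(x) = 0.5 * ||x||^2 (illustrative stand-in for a
    neural energy function)."""
    return 0.5 * np.sum(x ** 2, axis=-1)

def grad_energy(x):
    """Exact gradient of the toy energy: grad_x E(x) = x.
    For a neural energy this would be computed via autodiff."""
    return x

def zero_centered_gradient_penalty(x):
    """Mean squared norm of the energy gradient at the given points.
    Adding this penalty at data points pushes the score toward zero
    there, which is the stabilizing condition from score matching."""
    g = grad_energy(x)
    return np.mean(np.sum(g ** 2, axis=-1))

# Batch of "data" points (hypothetical values for the demo).
x = np.array([[1.0, 0.0], [0.0, 2.0]])
penalty = zero_centered_gradient_penalty(x)  # mean of 1.0 and 4.0
```

In training, this penalty would be added to the energy function's loss with a weight hyperparameter, discouraging sharp energy gradients at observed data.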
Related benchmarks
| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Image Generation | Stacked MNIST | Modes | 1.00e+3 | 32 |
| Image Generation | CIFAR-10 | FID | 34.55 | 12 |
| Out-of-Distribution Detection | CIFAR10 vs. SVHN | AUROC | 79 | 12 |
| Unconditional Image Generation | StackedMNIST 1000-mode (test) | # Modes | 1.00e+3 | 11 |
| Anomaly Detection | MNIST Heldout Digit 9 (test) | AUPRC | 34.2 | 7 |
| Anomaly Detection | MNIST Heldout Digit 1 (test) | AUPRC | 28.1 | 7 |
| Anomaly Detection | MNIST Heldout Digit 4 (test) | AUPRC | 0.401 | 7 |
| Anomaly Detection | MNIST Heldout Digit 5 (test) | AUPRC | 0.402 | 7 |
| Anomaly Detection | MNIST Heldout Digit 7 (test) | AUPRC | 0.29 | 7 |
| Anomaly Detection | MNIST | AUPRC (Digit 1) | 0.281 | 7 |