
Maximum Entropy Generators for Energy-Based Models

About

Maximum likelihood estimation of energy-based models is a challenging problem due to the intractability of the log-likelihood gradient. In this work, we propose learning both the energy function and an amortized approximate sampling mechanism using a neural generator network, which provides an efficient approximation of the log-likelihood gradient. The resulting objective requires maximizing entropy of the generated samples, which we perform using recently proposed nonparametric mutual information estimators. Finally, to stabilize the resulting adversarial game, we use a zero-centered gradient penalty derived as a necessary condition from the score matching literature. The proposed technique can generate sharp images with Inception and FID scores competitive with recent GAN techniques, does not suffer from mode collapse, and is competitive with state-of-the-art anomaly detection techniques.
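To make the training loop the abstract describes concrete, here is a minimal 1-D sketch: a quadratic energy, a linear Gaussian generator, the approximate maximum-likelihood gradient (data term minus generator-sample term), the zero-centered gradient penalty on real data, and an entropy term for the generator. All names, the toy model, and the closed-form Gaussian entropy (standing in for the paper's nonparametric mutual-information estimator) are illustrative assumptions, not the paper's architecture.

```python
import numpy as np

def train_meg_toy(steps=6000, batch=256, lr=0.01, lam=0.1, seed=0):
    """Toy 1-D sketch of MEG-style training (all names hypothetical)."""
    rng = np.random.default_rng(seed)
    data_mean, data_std = 3.0, 1.0  # target distribution N(3, 1)

    a, b = 1.0, 0.0   # energy  E(x) = a * (x - b)^2
    w, c = 1.0, 0.0   # generator G(z) = w * z + c, with z ~ N(0, 1)

    for _ in range(steps):
        x_real = rng.normal(data_mean, data_std, size=batch)
        z = rng.normal(size=batch)
        x_fake = w * z + c

        # EBM step: approximate log-likelihood gradient = data term minus
        # generator term, plus a zero-centred gradient penalty
        # lam * E_data[(dE/dx)^2], where dE/dx = 2a(x - b).
        r, f = x_real - b, x_fake - b
        grad_a = (r**2).mean() - (f**2).mean() + lam * (8 * a * r**2).mean()
        grad_b = (-2 * a * r).mean() - (-2 * a * f).mean() \
                 + lam * (-8 * a**2 * r).mean()
        a = float(np.clip(a - lr * grad_a, 0.05, 10.0))
        b -= lr * grad_b

        # Generator step: minimise E_z[E(G(z))] - H(G(z)).  For this Gaussian
        # generator the entropy is closed-form, 0.5*log(2*pi*e*w^2), whose
        # w-gradient is 1/w; the paper instead estimates entropy with a
        # nonparametric mutual-information estimator.
        e = x_fake - b
        grad_c = (2 * a * e).mean()
        grad_w = (2 * a * e * z).mean() - 1.0 / w
        c -= lr * grad_c
        w = float(np.clip(w - lr * grad_w, 0.1, 10.0))

    return a, b, w, c

a, b, w, c = train_meg_toy()
print(f"energy centre b={b:.2f}, generator mean c={c:.2f}")  # both near 3
```

Note the design choice this toy makes visible: without the entropy term (`-1.0 / w`) the generator would collapse onto the energy minimum (`w -> 0`), which is exactly the mode-collapse failure the entropy maximization is meant to prevent.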

Rithesh Kumar, Sherjil Ozair, Anirudh Goyal, Aaron Courville, Yoshua Bengio · 2019

Related benchmarks

Task | Dataset | Metric | Result | Rank
Image Generation | Stacked MNIST | Modes | 1.00e+3 | 32
Image Generation | CIFAR-10 | FID | 34.55 | 12
Out-of-Distribution Detection | CIFAR10 vs. SVHN | AUROC | 79 | 12
Unconditional Image Generation | StackedMNIST 1000-mode (test) | # Modes | 1.00e+3 | 11
Anomaly Detection | MNIST Heldout Digit 9 1 (test) | AUPRC | 34.2 | 7
Anomaly Detection | MNIST Heldout Digit 1 (test) | AUPRC | 28.1 | 7
Anomaly Detection | MNIST Heldout Digit 4 1 (test) | AUPRC | 0.401 | 7
Anomaly Detection | MNIST Heldout Digit 5 1 (test) | AUPRC | 0.402 | 7
Anomaly Detection | MNIST Heldout Digit 7 1 (test) | AUPRC | 0.29 | 7
Anomaly Detection | MNIST | AUPRC (Digit 1) | 0.281 | 7
Showing 10 of 14 rows
