
Implicit Generation and Generalization in Energy-Based Models

About

Energy-based models (EBMs) are appealing due to their generality and simplicity in likelihood modeling, but they have traditionally been difficult to train. We present techniques to scale MCMC-based EBM training to continuous neural networks, and we show its success on the high-dimensional data domains of ImageNet32x32, ImageNet128x128, CIFAR-10, and robotic hand trajectories, achieving better samples than other likelihood models and nearing the performance of contemporary GAN approaches, while covering all modes of the data. We highlight some unique capabilities of implicit generation, such as compositionality and corrupt image reconstruction and inpainting. Finally, we show that EBMs are useful models across a wide variety of tasks, achieving state-of-the-art out-of-distribution classification, adversarially robust classification, state-of-the-art continual online class learning, and coherent long-term predicted trajectory rollouts.

Yilun Du, Igor Mordatch • 2019
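The "MCMC-based EBM training" and "implicit generation" mentioned above refer to drawing samples by running a gradient-based Markov chain on the energy function rather than using an explicit generator network. A minimal sketch of this idea, using unadjusted Langevin dynamics on a toy quadratic energy (the function names and step settings here are illustrative, not taken from the paper's code):

```python
import math
import random

def energy(x):
    # Toy quadratic energy: a single mode at the origin.
    return 0.5 * sum(v * v for v in x)

def grad_energy(x):
    # Analytic gradient of 0.5*||x||^2; a neural EBM would use autodiff.
    return list(x)

def langevin_sample(x0, n_steps=200, step_size=0.01, seed=0):
    """Approximate sample from p(x) ~ exp(-E(x)) via unadjusted Langevin dynamics:
    x <- x - (step/2) * grad E(x) + sqrt(step) * noise."""
    rng = random.Random(seed)
    x = list(x0)
    for _ in range(n_steps):
        g = grad_energy(x)
        x = [xi - 0.5 * step_size * gi + math.sqrt(step_size) * rng.gauss(0.0, 1.0)
             for xi, gi in zip(x, g)]
    return x

# Start far from the mode; the chain drifts toward low-energy regions.
sample = langevin_sample([5.0, 5.0])
```

Samples produced this way are "implicit" in that they emerge from iterating the chain, which is also what enables compositionality (energies of multiple models can simply be summed before sampling).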

Related benchmarks

Task                            Dataset                  Metric           Result   Rank
Image Classification            CIFAR-10 (test)          Accuracy         49.1     585
Image Generation                CIFAR-10 (test)          FID              37.9     483
Image Generation                CIFAR-10                 Inception Score  6.78     178
Out-of-Distribution Detection   Textures                 AUROC            0.48     168
Unconditional Image Generation  CIFAR-10 unconditional   FID              40.58    165
Out-of-Distribution Detection   CIFAR-100 (test)         AUROC            54       57
Out-of-Distribution Detection   SVHN (test)              AUROC            0.63     48
Out-of-Distribution Detection   CelebA (test)            AUROC            70       36
Out-of-Distribution Detection   CIFAR-10 Interp.         AUROC            0.7      35
Image Generation                CIFAR-10                 FID              40.6     25

Showing 10 of 14 rows.
