
JEM++: Improved Techniques for Training JEM

About

Joint Energy-based Model (JEM) is a recently proposed hybrid model that retains the strong discriminative power of modern CNN classifiers while generating samples that rival the quality of GAN-based approaches. In this paper, we propose a variety of new training procedures and architectural features to improve JEM's accuracy, training stability, and speed altogether. 1) We propose a proximal SGLD to generate samples in the proximity of samples from the previous step, which improves stability. 2) We further treat the approximate maximum likelihood learning of EBM as a multi-step differential game and extend the YOPO framework to cut out redundant calculations during backpropagation, which accelerates training substantially. 3) Rather than initializing the SGLD chain from random noise, we introduce a new informative initialization that samples from a distribution estimated from the training data. 4) This informative initialization allows us to enable batch normalization in JEM, which further unleashes the power of modern CNN architectures for hybrid modeling. Code: https://github.com/sndnyang/JEMPP
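The proximal SGLD idea in point 1) can be illustrated with a minimal sketch: each Langevin update is projected into a small neighborhood of the previous sample, so the chain cannot jump arbitrarily far in one step. Everything below is an illustrative assumption, not the paper's implementation: a toy quadratic energy stands in for the classifier's negative LogSumExp energy, and the step size, noise scale, and trust-region radius are made-up hyperparameters.

```python
import numpy as np

def energy(x):
    # Toy quadratic energy; JEM++ would instead use -LogSumExp of
    # the classifier logits as the energy of x.
    return 0.5 * np.sum(x ** 2)

def grad_energy(x):
    # Gradient of the toy energy (autograd would supply this for a CNN).
    return x

def proximal_sgld(x0, n_steps=20, step=1.0, noise=0.01, prox=0.1, seed=0):
    """SGLD where each update is clipped to an L-infinity ball of radius
    `prox` around the previous sample, keeping new samples in the
    proximity of old ones and stabilizing the chain."""
    rng = np.random.default_rng(seed)
    x = x0.copy()
    for _ in range(n_steps):
        # Standard SGLD proposal: gradient descent on energy plus noise.
        delta = -step * grad_energy(x) + noise * rng.standard_normal(x.shape)
        # Proximal projection: bound how far a single step can move.
        delta = np.clip(delta, -prox, prox)
        x = x + delta
    return x
```

In this sketch the projection acts like a per-step trust region; an informative initialization (point 3) would correspond to choosing `x0` by sampling from a simple distribution fitted to the training data rather than from pure noise.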

Xiulong Yang, Shihao Ji • 2021

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Image Classification | CIFAR-10 (test) | Accuracy | 94.1 | 585 |
| Image Generation | CIFAR-10 (test) | FID | 37.1 | 483 |
| Out-of-Distribution Detection | CIFAR-100 (test) | AUROC | 88 | 57 |
| Out-of-Distribution Detection | SVHN (test) | AUROC | 0.94 | 48 |
| Out-of-Distribution Detection | CelebA (test) | AUROC | 90 | 36 |
| Out-of-Distribution Detection | CIFAR-10 Interp. | AUROC | 0.77 | 35 |
| Image Generation | CIFAR-100 (test) | IS | 10.07 | 35 |
| Generative Modeling | CIFAR-10 | FID | 38 | 27 |
| Classification | CIFAR-10 | Accuracy | 94.1 | 15 |
| Image Classification | CIFAR-10 (test) | Accuracy | 93.73 | 8 |
