
Flow Contrastive Estimation of Energy-Based Models

About

This paper studies a training method that jointly estimates an energy-based model and a flow-based model, in which the two models are iteratively updated based on a shared adversarial value function. This joint training method has the following traits. (1) The update of the energy-based model is based on noise contrastive estimation, with the flow model serving as a strong noise distribution. (2) The update of the flow model approximately minimizes the Jensen-Shannon divergence between the flow model and the data distribution. (3) Unlike generative adversarial networks (GANs), which estimate an implicit probability distribution defined by a generator model, our method estimates two explicit probability distributions on the data. Using the proposed method, we demonstrate a significant improvement in the synthesis quality of the flow model and show the effectiveness of unsupervised feature learning by the learned energy-based model. Furthermore, the proposed training method can be easily adapted to semi-supervised learning. We achieve results competitive with state-of-the-art semi-supervised learning methods.
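The shared adversarial value function described in the abstract can be sketched in a toy setting. The following is a minimal, hypothetical illustration, not the paper's implementation: `log_p_ebm` and `log_q_flow` are simple 1-D stand-ins for the energy-based model (with a learned log-normalizer, as in noise contrastive estimation) and the flow's exact log-density. The EBM is trained to ascend this value, while the flow is trained to descend it.

```python
import numpy as np

def log_q_flow(x):
    # Stand-in for a normalizing flow's exact log-density: standard Gaussian N(0, 1).
    return -0.5 * x**2 - 0.5 * np.log(2 * np.pi)

def log_p_ebm(x, theta=0.5, log_z=0.0):
    # Stand-in for an energy-based model: energy theta * x^2 minus a learned log Z.
    # In NCE the log-normalizer log_z is treated as an extra trainable parameter.
    return -theta * x**2 - log_z

def fce_value(x_data, x_flow, theta, log_z):
    """Shared adversarial value function of flow contrastive estimation:
    V = E_data[log p/(p+q)] + E_flow[log q/(p+q)],
    a logistic-classification objective between data and flow samples."""
    lp_d, lq_d = log_p_ebm(x_data, theta, log_z), log_q_flow(x_data)
    lp_f, lq_f = log_p_ebm(x_flow, theta, log_z), log_q_flow(x_flow)
    # log p/(p+q) and log q/(p+q), computed stably via logaddexp.
    term_data = (lp_d - np.logaddexp(lp_d, lq_d)).mean()
    term_flow = (lq_f - np.logaddexp(lp_f, lq_f)).mean()
    return term_data + term_flow

rng = np.random.default_rng(0)
x_data = rng.normal(0.0, 1.0, size=10_000)  # "data" samples
x_flow = rng.normal(0.0, 1.0, size=10_000)  # samples drawn from the flow (noise)

# When the EBM density exactly matches the flow density, each log-ratio is
# log(1/2) pointwise, so V = -2 log 2: the classifier cannot tell the two apart.
v = fce_value(x_data, x_flow, theta=0.5, log_z=0.5 * np.log(2 * np.pi))
print(v)
```

At the saddle point the EBM and flow both match the data distribution, which is why the update of the flow approximately minimizes the Jensen-Shannon divergence to the data.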

Ruiqi Gao, Erik Nijkamp, Diederik P. Kingma, Zhen Xu, Andrew M. Dai, Ying Nian Wu • 2019

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Image Generation | CelebA 64x64 (test) | FID | 12.21 | 203 |
| Classification | SVHN (test) | Error Rate | 3.87 | 182 |
| Image Generation | CIFAR-10 | -- | -- | 178 |
| Unconditional Image Generation | CIFAR-10 | FID | 37.3 | 171 |
| Unconditional Image Generation | CIFAR-10 unconditional | FID | 37.3 | 159 |
| Generative Modeling | CIFAR-10 (test) | NLL (bits/dim) | 3.27 | 62 |
| Image Generation | SVHN | FID | 20.19 | 20 |
| Image Generation | CelebA 32x32 (test) | FID | 12.21 | 17 |
| Unconditional image synthesis | CIFAR-10 32x32 (test) | FID | 37.3 | 12 |
| Generative Modeling | SVHN (test) | Bits Per Dimension | 2.15 | 3 |

Showing 10 of 11 rows.

Other info

Code
