
Learning Non-Convergent Non-Persistent Short-Run MCMC Toward Energy-Based Model

About

This paper studies a curious phenomenon in learning an energy-based model (EBM) using MCMC. In each learning iteration, we generate synthesized examples by running a non-convergent, non-mixing, and non-persistent short-run MCMC toward the current model, always starting from the same initial distribution, such as a uniform noise distribution, and always running a fixed number of MCMC steps. After generating synthesized examples, we then update the model parameters according to the maximum likelihood learning gradient, as if the synthesized examples were fair samples from the current model. We treat this non-convergent short-run MCMC as a learned generator model or a flow model, and we provide arguments for treating it as a valid model. We show that the learned short-run MCMC is capable of generating realistic images. More interestingly, unlike a traditional EBM or MCMC, the learned short-run MCMC is capable of reconstructing observed images and interpolating between images, like generator or flow models. The code can be found in the Appendix.
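The learning loop described above can be sketched on a toy one-dimensional EBM. This is a minimal illustration, not the paper's implementation: the energy here is a hypothetical quadratic U(x) = theta·x²/2 (the paper uses a ConvNet energy), the step sizes and iteration counts are arbitrary choices, and the short-run sampler is plain Langevin dynamics restarted from noise with a fixed, small number of steps. The point the sketch demonstrates is the one the paper argues: at convergence, the short-run samples match the data's sufficient statistics (here, the second moment), even though the chains never mix.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy EBM: p_theta(x) ∝ exp(-theta * x^2 / 2).
# (Hypothetical quadratic energy for illustration only.)
def grad_energy(theta, x):
    # dU/dx for U(x) = theta * x^2 / 2
    return theta * x

def short_run_langevin(theta, n, K=100, step=0.05):
    # Non-persistent short-run MCMC: always restart from the same
    # fixed noise distribution and run exactly K Langevin steps.
    x = rng.uniform(-3.0, 3.0, size=n)
    for _ in range(K):
        x = x - 0.5 * step**2 * grad_energy(theta, x) \
              + step * rng.standard_normal(n)
    return x

def ml_update(theta, data, lr=0.2):
    # Maximum-likelihood gradient, treating the short-run samples as
    # if they were fair samples from p_theta:
    #   d/dtheta log p = E_model[dU/dtheta] - E_data[dU/dtheta],
    # with dU/dtheta = x^2 / 2 for this toy energy.
    samples = short_run_langevin(theta, len(data))
    grad = 0.5 * (np.mean(samples**2) - np.mean(data**2))
    return theta + lr * grad

# Observed data from N(0, 0.5); run the learning loop.
data = rng.normal(0.0, np.sqrt(0.5), size=2000)
theta = 1.0
for _ in range(500):
    theta = ml_update(theta, data)
```

After training, fresh short-run chains produce samples whose second moment matches that of the data, which is why the non-convergent sampler can be treated as a valid generator model in its own right, even though its marginal distribution differs from the EBM's stationary distribution.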

Erik Nijkamp, Mitch Hill, Song-Chun Zhu, Ying Nian Wu • 2019

Related benchmarks

Task                                  Dataset                  Result       Rank
Image Generation                      CIFAR-10 (test)          --           483
Image Generation                      CelebA 64 x 64 (test)    FID 23.02    208
Image Generation                      CIFAR-10                 FID 44.5     203
Unconditional Image Generation        CIFAR-10 unconditional   --           165
Image Generation                      CelebA-64                FID 23.03    75
Image Generation                      SVHN (test)              FID 35.32    20
Image Generation and Reconstruction   CelebA (test)            FID 47.95    11
