
Flow++: Improving Flow-Based Generative Models with Variational Dequantization and Architecture Design

About

Flow-based generative models are powerful exact likelihood models with efficient sampling and inference. Despite their computational efficiency, flow-based models generally have much worse density modeling performance compared to state-of-the-art autoregressive models. In this paper, we investigate and improve upon three limiting design choices employed by flow-based models in prior work: the use of uniform noise for dequantization, the use of inexpressive affine flows, and the use of purely convolutional conditioning networks in coupling layers. Based on our findings, we propose Flow++, a new flow-based model that is now the state-of-the-art non-autoregressive model for unconditional density estimation on standard image benchmarks. Our work has begun to close the significant performance gap that has so far existed between autoregressive models and flow-based models. Our implementation is available at https://github.com/aravindsrinivas/flowpp
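The first design choice the abstract mentions, dequantization, can be made concrete with a small sketch. The bound below is the standard variational dequantization objective: lift discrete data x to continuous values by adding noise u drawn from a learned q(u|x), giving log P(x) ≥ E[log p_model(x + u) − log q(u|x)]; uniform dequantization is the special case where q is Uniform[0,1)^D. The function names and toy uniform model here are our own illustration, not the paper's API.

```python
import math
import random

def dequant_elbo(x, log_p_model, sample_q, log_q):
    # Variational dequantization bound (illustrative names, not Flow++'s code):
    #   log P(x) >= E_{u ~ q(.|x)} [ log p_model(x + u) - log q(u | x) ]
    # Uniform dequantization is recovered with q = Uniform[0,1)^D (log q = 0).
    u = sample_q(x)
    y = [xi + ui for xi, ui in zip(x, u)]
    return log_p_model(y) - log_q(u, x)

# Toy sanity check: a continuous model that is uniform on [0, 256)^D,
# combined with uniform dequantization of 8-bit data, is bounded at
# exactly -D * log(256) nats, i.e. 8 bits per dimension.
D = 4
x = [0, 17, 128, 255]                                # 8-bit "pixels"
log_p = lambda y: -len(y) * math.log(256.0)          # uniform model density
sample_q = lambda x: [random.random() for _ in x]    # u ~ Uniform[0,1)^D
log_q = lambda u, x: 0.0                             # log density of Uniform[0,1)^D

elbo = dequant_elbo(x, log_p, sample_q, log_q)
bpd = -elbo / (D * math.log(2))                      # convert nats to bits/dim
print(bpd)
```

A learned q(u|x) tightens this bound because the noise distribution can adapt to the model's density around each data point, which is the gap between uniform and variational dequantization that Flow++ exploits.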

Jonathan Ho, Xi Chen, Aravind Srinivas, Yan Duan, Pieter Abbeel • 2019

Related benchmarks

| Task                           | Dataset               | Metric             | Result | Rank |
|--------------------------------|-----------------------|--------------------|--------|------|
| Image Generation               | CIFAR-10 (test)       | FID                | 46.9   | 471  |
| Unconditional Image Generation | CIFAR-10 (test)       | --                 | --     | 216  |
| Density Estimation             | CIFAR-10 (test)       | Bits/dim           | 3.08   | 134  |
| Density Estimation             | ImageNet 32x32 (test) | Bits per Sub-pixel | 3.86   | 66   |
| Generative Modeling            | CIFAR-10 (test)       | NLL (bits/dim)     | 3.08   | 62   |
| Density Estimation             | ImageNet 64x64 (test) | Bits per Sub-pixel | 3.69   | 62   |
| Generative Modeling            | CIFAR-10              | BPD                | 3.08   | 46   |
| Density Estimation             | CIFAR-10              | BPD                | 3.08   | 40   |
| Unconditional Image Generation | CIFAR-10              | BPD                | 3.08   | 33   |
| Unconditional Image Generation | ImageNet-32           | BPD                | 3.86   | 31   |

Showing 10 of 24 rows
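The bits/dim (BPD, also "bits per sub-pixel") numbers above are negative log-likelihoods normalized by the number of dimensions: a CIFAR-10 image has 32 × 32 × 3 = 3072 sub-pixels, so a fixed BPD translates directly into a total code length per image. A minimal conversion sketch (function names are ours):

```python
import math

def bits_per_image(bpd, shape=(32, 32, 3)):
    # Total code length in bits: bits/dim times the number of sub-pixels.
    dims = 1
    for s in shape:
        dims *= s
    return bpd * dims

def nats_to_bpd(nll_nats, dims):
    # Convert a negative log-likelihood in nats to bits per dimension.
    return nll_nats / (dims * math.log(2))

# Flow++'s 3.08 bits/dim on CIFAR-10 corresponds to about 9462 bits per image.
print(bits_per_image(3.08))
```

This is why the CIFAR-10 rows with metrics labeled Bits/dim, NLL (bits/dim), and BPD all report the same 3.08: they are the same quantity under different leaderboard names.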
