
Emerging Convolutions for Generative Normalizing Flows

About

Generative flows are attractive because they admit exact likelihood optimization and efficient image synthesis. Recently, Kingma & Dhariwal (2018) demonstrated with Glow that generative flows are capable of generating high quality images. We generalize the 1 × 1 convolutions proposed in Glow to invertible d × d convolutions, which are more flexible since they operate on both channel and spatial axes. We propose two methods to produce invertible convolutions that have receptive fields identical to standard convolutions: Emerging convolutions are obtained by chaining specific autoregressive convolutions, and periodic convolutions are decoupled in the frequency domain. Our experiments show that the flexibility of d × d convolutions significantly improves the performance of generative flow models on galaxy images, CIFAR10 and ImageNet.
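The frequency-domain idea behind periodic convolutions can be sketched in one dimension: a circular convolution is elementwise multiplication in the Fourier domain, so it can be inverted by elementwise division, and the log-determinant of its Jacobian is the sum of log-magnitudes of the filter's frequency response. The NumPy sketch below is only an illustration of this principle, not the paper's multi-channel d × d implementation; the function names and the example filter are hypothetical.

```python
import numpy as np

def periodic_conv(x, w):
    # Circular convolution = elementwise product in the frequency domain.
    return np.real(np.fft.ifft(np.fft.fft(x) * np.fft.fft(w, n=len(x))))

def periodic_conv_inverse(y, w):
    # Invert by elementwise division; valid when the filter's frequency
    # response has no zeros (i.e. the convolution is invertible).
    return np.real(np.fft.ifft(np.fft.fft(y) / np.fft.fft(w, n=len(y))))

def conv_log_det(w, n):
    # log|det| of the circular-convolution matrix: sum of log-magnitudes
    # of the filter's DFT, since the DFT diagonalizes circulant matrices.
    return np.sum(np.log(np.abs(np.fft.fft(w, n=n))))

rng = np.random.default_rng(0)
x = rng.normal(size=8)
w = np.array([1.0, 0.5, 0.1])  # hypothetical filter; frequency response is nowhere zero
y = periodic_conv(x, w)
x_rec = periodic_conv_inverse(y, w)
```

The same decoupling is what makes both the inverse and the Jacobian log-determinant cheap to compute exactly, which is the property a normalizing flow needs.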

Emiel Hoogeboom, Rianne van den Berg, Max Welling • 2019

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
| --- | --- | --- | --- | --- |
| Density Estimation | CIFAR-10 (test) | Bits/dim | 3.34 | 134 |
| Density Estimation | ImageNet 32x32 (test) | Bits per Sub-pixel | 4.09 | 66 |
| Density Estimation | ImageNet 64x64 (test) | Bits per Sub-pixel | 3.81 | 62 |
| Unconditional Image Generation | CIFAR10 | BPD | 3.34 | 33 |
| Unconditional Image Generation | ImageNet-32 | BPD | 4.09 | 31 |
| Unconditional Image Generation | ImageNet 64 | BPD | 3.81 | 22 |
| Generative Modeling | ImageNet 64x64 downsampled | Bits per Dimension | 3.81 | 13 |
| Image Generation | MNIST (test) | -- | -- | 13 |
| Generative Modeling | MNIST | -- | -- | 10 |
| Generative Modeling | ImageNet 32x32 | BPD | 4.09 | 4 |

Showing 10 of 11 rows
