
MintNet: Building Invertible Neural Networks with Masked Convolutions

About

We propose a new way of constructing invertible neural networks by combining simple building blocks with a novel set of composition rules. This leads to a rich set of invertible architectures, including those similar to ResNets. Inversion is achieved with a locally convergent iterative procedure that is parallelizable and very fast in practice. Additionally, the determinant of the Jacobian can be computed analytically and efficiently, enabling their generative use as flow models. To demonstrate their flexibility, we show that our invertible neural networks are competitive with ResNets on MNIST and CIFAR-10 classification. When trained as generative models, our invertible networks achieve competitive likelihoods on MNIST, CIFAR-10 and ImageNet 32x32, with bits per dimension of 0.98, 3.32 and 4.06 respectively.
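To make the construction above concrete, here is a minimal, hedged sketch of a single masked-convolution flow layer, assuming PyTorch (the paper does not prescribe a framework). The `causal_mask` helper, the `MaskedConvFlow` class, and the purely linear layer are simplified illustrations of the idea rather than MintNet's actual architecture: the mask makes the Jacobian lower triangular under a raster-scan ordering, so its log-determinant reduces to the centre-tap diagonal weights, and inversion can be done with a simple, parallelizable fixed-point iteration in the spirit of the procedure the abstract describes.

```python
import torch
import torch.nn.functional as F


def causal_mask(weight_shape):
    """Build a raster-scan mask: each output position may depend only on
    earlier pixels, and at the centre tap only on channels with index <= its
    own, so the layer's Jacobian is lower triangular."""
    out_c, in_c, kh, kw = weight_shape
    mask = torch.ones(weight_shape)
    ch, cw = kh // 2, kw // 2
    mask[:, :, ch, cw + 1:] = 0                      # later columns, centre row
    mask[:, :, ch + 1:, :] = 0                       # later rows
    mask[:, :, ch, cw] = torch.tril(torch.ones(out_c, in_c))  # centre tap
    return mask


class MaskedConvFlow(torch.nn.Module):
    """A single (linear) masked-convolution flow layer: y = masked_conv(x).
    The Jacobian is lower triangular, so log|det J| is H * W times the sum of
    log|centre-tap diagonal weights|."""

    def __init__(self, channels, kernel_size=3):
        super().__init__()
        self.conv = torch.nn.Conv2d(channels, channels, kernel_size,
                                    padding=kernel_size // 2)
        self.register_buffer("mask", causal_mask(self.conv.weight.shape))
        with torch.no_grad():                        # keep the diagonal near 1
            self.conv.weight.mul_(0.01)
            c = kernel_size // 2
            self.conv.weight[:, :, c, c] += torch.eye(channels)

    def _masked_weight(self):
        return self.conv.weight * self.mask

    def forward(self, x):
        w = self._masked_weight()
        y = F.conv2d(x, w, self.conv.bias, padding=self.conv.padding)
        c = self.conv.kernel_size[0] // 2
        diag = torch.diagonal(w[:, :, c, c])         # per-channel diagonal
        logdet = x.shape[2] * x.shape[3] * diag.abs().log().sum()
        return y, logdet

    @torch.no_grad()
    def invert(self, y, n_iters=50):
        """Fixed-point iteration x <- x + D^{-1} (y - conv(x)). Because the
        Jacobian is triangular, the iteration matrix is strictly lower
        triangular (nilpotent), so the iteration converges."""
        w = self._masked_weight()
        c = self.conv.kernel_size[0] // 2
        d = torch.diagonal(w[:, :, c, c]).view(1, -1, 1, 1)
        x = torch.zeros_like(y)
        for _ in range(n_iters):
            residual = y - F.conv2d(x, w, self.conv.bias,
                                    padding=self.conv.padding)
            x = x + residual / d
        return x


# Hypothetical usage: the iterative inverse should reconstruct the input.
layer = MaskedConvFlow(channels=3)
x = torch.randn(1, 3, 8, 8)
y, logdet = layer(x)
x_rec = layer.invert(y)
print(logdet.item(), (x - x_rec).abs().max().item())
```

In this sketch the triangular structure is what makes both quantities cheap: the log-determinant is read off the centre-tap diagonal, and the fixed-point inverse recovers the input to near machine precision in a handful of iterations.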

Yang Song, Chenlin Meng, Stefano Ermon • 2019

Related benchmarks

Task | Dataset | Metric | Result | Rank
Unconditional Image Generation | CIFAR-10 (test) | - | - | 216
Density Estimation | ImageNet 32x32 (test) | Bits per Sub-pixel | 4.06 | 66
Density Estimation | CIFAR-10 | bpd | 3.32 | 40
Sampling | CIFAR-10 | Sampling Time (s) | 117.8 | 39
Unconditional Image Generation | CIFAR-10 | BPD | 3.32 | 33
Unconditional Image Generation | ImageNet-32 | BPD | 4.06 | 31
Likelihood Estimation | CIFAR-10 (test) | NLL (BPD) | 3.32 | 24
Density Estimation | MNIST | bpd | 0.98 | 12
Generative Modeling | MNIST | BPD | 0.98 | 10
Sampling | ImageNet 32x32 | Sampling Time (s) | 120.8 | 9

(Showing 10 of 12 rows.)
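The bpd, BPD, and NLL (BPD) columns above all report the same metric, bits per dimension (equivalently, bits per sub-pixel for images): the model's negative log-likelihood divided by the number of data dimensions and expressed in base 2. A minimal sketch of the conversion in plain Python; the function name and the example NLL value are illustrative, not taken from the paper:

```python
import math


def bits_per_dim(nll_nats, num_dims):
    """Convert an average per-example negative log-likelihood in nats into
    bits per dimension (bpd): divide by the dimension count and by ln 2."""
    return nll_nats / (num_dims * math.log(2))


# A 32x32 RGB image has 32 * 32 * 3 = 3072 dimensions (sub-pixels), so an
# illustrative NLL of about 8650 nats per image is roughly 4.06 bpd.
print(round(bits_per_dim(8650.0, 32 * 32 * 3), 2))
```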
