
Flowification: Everything is a Normalizing Flow

About

The two key characteristics of a normalizing flow are that it is invertible (in particular, dimension preserving) and that it tracks the amount by which it changes the likelihood of data points as samples propagate through the network. Recently, multiple generalizations of normalizing flows have been introduced that relax these two conditions. By contrast, generic neural networks only perform a forward pass on the input: there is neither a notion of the inverse of a neural network nor one of its likelihood contribution. In this paper we argue that certain neural network architectures can be enriched with a stochastic inverse pass, and that their likelihood contribution can be monitored, in a way that makes them fall under the generalized notion of a normalizing flow mentioned above. We term this enrichment flowification. We prove that neural networks containing only linear layers, convolutional layers and invertible activations such as LeakyReLU can be flowified, and we evaluate them in the generative setting on image datasets.
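To make the two defining properties concrete, the following sketch (not the paper's code; names and the negative slope are illustrative assumptions) treats an elementwise LeakyReLU as a flow layer: it is exactly invertible, and its likelihood contribution is the log-determinant of its Jacobian under the change-of-variables formula.

```python
import numpy as np

ALPHA = 0.1  # LeakyReLU negative slope; hypothetical value for illustration

def leaky_relu_forward(x):
    """Forward pass plus the layer's log-det-Jacobian contribution.

    LeakyReLU acts elementwise, so its Jacobian is diagonal and the
    log-determinant is a sum of per-coordinate log-derivatives:
    log 1 = 0 where x >= 0, and log ALPHA where x < 0.
    """
    y = np.where(x >= 0, x, ALPHA * x)
    log_det = np.sum(np.where(x >= 0, 0.0, np.log(ALPHA)), axis=-1)
    return y, log_det

def leaky_relu_inverse(y):
    """Exact inverse: LeakyReLU preserves the sign of its input."""
    return np.where(y >= 0, y, y / ALPHA)

x = np.array([[1.5, -2.0, 0.3]])
y, log_det = leaky_relu_forward(x)
x_reconstructed = leaky_relu_inverse(y)
```

Because the map is invertible and dimension preserving, the log-likelihood of a data point under the flow is the base-density log-likelihood of its image plus this tracked `log_det` term; the paper's contribution is extending this bookkeeping to layers that are not invertible in this strict sense.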

Bálint Máté, Samuel Klein, Tobias Golling, François Fleuret • 2022

Related benchmarks

Task                             | Dataset              | Metric                             | Result | Rank
Density Estimation               | CIFAR-10 (test)      | Bits/dim                           | 3.69   | 134
Density Estimation               | MNIST (test)         | NLL (bits/dim)                     | 1.35   | 56
Unconditional Density Estimation | POWER (test)         | Average Test Log Likelihood (nats) | -0.5   | 30
Density Estimation               | BSDS300 (test)       | NLL (nats)                         | 144.2  | 25
Density Estimation               | HEPMASS UCI (test)   | Log-likelihood                     | -19.56 | 12
Density Estimation               | MiniBooNE UCI (test) | Test Log Likelihood (nats)         | -14.05 | 9
Density Estimation               | GAS UCI (test)       | Test Log Likelihood (nats)         | 5.35   | 3

Other info

Code
