
Gaussianization Flows

About

Iterative Gaussianization is a fixed-point iteration procedure that can transform any continuous random vector into a Gaussian one. Based on iterative Gaussianization, we propose a new type of normalizing flow model that enables both efficient computation of likelihoods and efficient inversion for sample generation. We demonstrate that these models, named Gaussianization flows, are universal approximators for continuous probability distributions under some regularity conditions. Because of this guaranteed expressivity, they can capture multimodal target distributions without compromising the efficiency of sample generation. Experimentally, we show that Gaussianization flows achieve better or comparable performance on several tabular datasets compared to other efficiently invertible flow models such as Real NVP, Glow and FFJORD. In particular, Gaussianization flows are easier to initialize, demonstrate better robustness with respect to different transformations of the training data, and generalize better on small training sets.
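As a rough illustration of the iterative Gaussianization idea the abstract describes (not the paper's learned flow layers), one classical fixed-point step maps each dimension through its empirical CDF and the inverse standard-normal CDF, then applies a random orthogonal rotation. The function names and the rank-based CDF estimate below are illustrative choices, not the authors' implementation:

```python
import numpy as np
from scipy.stats import norm

def marginal_gaussianize(x):
    """Map each column of x through its empirical CDF, then through
    the inverse standard-normal CDF (rank-based estimate)."""
    n, _ = x.shape
    ranks = np.argsort(np.argsort(x, axis=0), axis=0) + 1
    u = ranks / (n + 1)          # empirical CDF values in (0, 1)
    return norm.ppf(u)

def gaussianization_step(x, rng):
    """One fixed-point iteration: marginal Gaussianization
    followed by a random orthogonal rotation."""
    z = marginal_gaussianize(x)
    # QR decomposition of a Gaussian matrix yields a random rotation.
    q, _ = np.linalg.qr(rng.standard_normal((x.shape[1], x.shape[1])))
    return z @ q

rng = np.random.default_rng(0)
x = rng.standard_normal((1000, 2)) ** 3   # a non-Gaussian sample
for _ in range(5):
    x = gaussianization_step(x, rng)
```

The paper replaces these nonparametric steps with trainable, invertible layers so that both the log-likelihood and the inverse map remain efficient to compute.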

Chenlin Meng, Yang Song, Jiaming Song, Stefano Ermon • 2020

Related benchmarks

Task                             | Dataset          | Metric         | Result | Rank
Density Estimation               | MNIST (test)     | NLL (bits/dim) | 1.29   | 56
Unconditional Density Estimation | POWER (test)     | --             | --     | 30
Density Estimation               | Fashion (test)   | NLL (bits/dim) | 3.35   | 27
Density Estimation               | BSDS300 (test)   | NLL (nats)     | -152.8 | 25
Unconditional Density Estimation | MINIBOONE (test) | NLL (nats)     | 10.32  | 22
Unconditional Density Estimation | HEPMASS (test)   | NLL (nats)     | 17.59  | 22
Density Estimation               | GAS (test)       | --             | --     | 10
