
Neural Autoregressive Flows

About

Normalizing flows and autoregressive models have been successfully combined to produce state-of-the-art results in density estimation, via Masked Autoregressive Flows (MAF), and to accelerate state-of-the-art WaveNet-based speech synthesis to 20x faster than real-time, via Inverse Autoregressive Flows (IAF). We unify and generalize these approaches, replacing the (conditionally) affine univariate transformations of MAF/IAF with a more general class of invertible univariate transformations expressed as monotonic neural networks. We demonstrate that the proposed neural autoregressive flows (NAF) are universal approximators for continuous probability distributions, and their greater expressivity allows them to better capture multimodal target distributions. Experimentally, NAF yields state-of-the-art performance on a suite of density estimation tasks and outperforms IAF in variational autoencoders trained on binarized MNIST.
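The core idea can be sketched concretely: NAF replaces each affine transformer of MAF/IAF with a monotonic neural network. One instance from the paper is the deep sigmoidal flow, where the output is a logit of a convex combination of sigmoids with positive slopes, which is strictly increasing and hence invertible. Below is a minimal, hedged sketch of a single such univariate transformer; the function and parameter names are illustrative, not the authors' implementation (which also conditions the parameters autoregressively on previous dimensions).

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def logit(p):
    return np.log(p) - np.log1p(-p)

def dsf_transform(x, a, b, w):
    """One deep-sigmoidal-flow transformer unit (illustrative sketch):
        y = logit( sum_k w_k * sigmoid(exp(a_k) * x + b_k) )
    Strictly increasing in x because each slope exp(a_k) > 0 and the
    softmax weights w_k form a convex combination, so the map is
    invertible, as required of a flow's univariate transformation."""
    slopes = np.exp(a)                         # enforce positivity -> monotonicity
    weights = np.exp(w) / np.exp(w).sum()      # softmax -> weights on the simplex
    s = (weights * sigmoid(slopes * x + b)).sum()
    return logit(s)
```

In the full model, `a`, `b`, and `w` for dimension t would be produced by a conditioner network from dimensions 1..t-1, preserving the autoregressive structure that makes the Jacobian triangular.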

Chin-Wei Huang, David Krueger, Alexandre Lacoste, Aaron Courville • 2018

Related benchmarks

Task | Dataset | Metric | Result | Rank
Unconditional Density Estimation | POWER (test) | Average Test Log Likelihood (nats) | 0.62 | 30
Density Estimation | BSDS300 (test) | NLL (nats) | -157.7 | 25
Density Estimation | GAS (d=8; N=1,052,065) (test) | Avg Test Log-Likelihood | 11.96 | 25
Unconditional Density Estimation | MINIBOONE (test) | NLL (nats) | 8.86 | 22
Unconditional Density Estimation | HEPMASS (test) | NLL (nats) | 15.09 | 22
Density Estimation | HEPMASS UCI (test) | Log-likelihood | -15.09 | 12
Density Estimation | MINIBOONE (test) | Avg Log-Likelihood | -8.86 | 10
Density Estimation | GAS (test) | Average Log-Likelihood | 11.96 | 10
