
Nonlinear Information Bottleneck

About

Information bottleneck (IB) is a technique for extracting information in one random variable $X$ that is relevant for predicting another random variable $Y$. IB works by encoding $X$ in a compressed "bottleneck" random variable $M$ from which $Y$ can be accurately decoded. However, finding the optimal bottleneck variable involves a difficult optimization problem, which until recently has been considered for only two limited cases: discrete $X$ and $Y$ with small state spaces, and continuous $X$ and $Y$ with a Gaussian joint distribution (in which case optimal encoding and decoding maps are linear). We propose a method for performing IB on arbitrarily-distributed discrete and/or continuous $X$ and $Y$, while allowing for nonlinear encoding and decoding maps. Our approach relies on a novel non-parametric upper bound for mutual information. We describe how to implement our method using neural networks. We then show that it achieves better performance than the recently-proposed "variational IB" method on several real-world datasets.
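To make the approach concrete, below is a minimal numpy sketch of the kind of non-parametric mutual-information upper bound the abstract refers to, assuming a stochastic Gaussian encoder $p(m \mid x_i) = \mathcal{N}(\mu_i, \sigma^2 I)$ whose means $\mu_i$ come from a neural network. The function name, the pairwise-KL form of the bound, and the constants are illustrative assumptions, not the paper's exact implementation.

```python
import numpy as np

def mi_upper_bound(mu, sigma2):
    """Sketch of a non-parametric upper bound on I(X; M) for a
    Gaussian encoder p(m|x_i) = N(mu_i, sigma2 * I), using pairwise
    KL divergences between the per-sample encoding distributions.
    (Illustrative: exact form/constants are an assumption.)"""
    n = mu.shape[0]
    # Pairwise squared distances between encoder means, shape (n, n).
    d2 = ((mu[:, None, :] - mu[None, :, :]) ** 2).sum(axis=-1)
    # KL( N(mu_i, s2*I) || N(mu_j, s2*I) ) = ||mu_i - mu_j||^2 / (2 s2)
    kl = d2 / (2.0 * sigma2)
    # Bound: -(1/n) * sum_i log( (1/n) * sum_j exp(-KL_ij) )
    return float(-np.mean(np.log(np.mean(np.exp(-kl), axis=1))))

# If all encoder means coincide, the bound is 0 nats (M carries no
# information about which x_i produced it); if the means are far apart
# relative to sigma2, the bound approaches log(n).
```

During training, a term like this would be minimized alongside a cross-entropy decoding loss, trading compression of $M$ against predictive accuracy for $Y$.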

Artemy Kolchinsky, Brendan D. Tracey, David H. Wolpert · 2017

Related benchmarks

| Task                 | Dataset                       | Result              | Rank |
|----------------------|-------------------------------|---------------------|------|
| Image Classification | MNIST (test)                  | Test Accuracy 97.2  | 189  |
| Image Classification | CINIC-10 (test)               | --                  | 177  |
| Image Classification | CIFAR-10-C (test)             | --                  | 61   |
| Image Classification | MNIST (train)                 | Train Accuracy 75.2 | 53   |
| Image Classification | ANIMAL-10N                    | Accuracy 0.7562     | 43   |
| Image Classification | CIFAR-100-N                   | --                  | 41   |
| Image Classification | CIFAR-100 Sym-20% (test)      | Accuracy 55.99      | 33   |
| Image Classification | CIFAR-100 Sym-50% (test)      | Accuracy 46.2      | 32   |
| Image Classification | CIFAR-10 40% asymmetric noise | Accuracy 78.16      | 27   |
| Image Classification | CIFAR-10.1 (test)             | Test Error 14.6     | 13   |

Showing 10 of 18 rows.
