
Nearest-Neighbor Density Estimation for Dependency Suppression

About

The ability to remove unwanted dependencies from data is crucial in various domains, including fairness, robust learning, and privacy protection. In this work, we propose an encoder-based approach that learns a representation independent of a sensitive variable but otherwise preserving essential data characteristics. Unlike existing methods that rely on decorrelation or adversarial learning, our approach explicitly estimates and modifies the data distribution to neutralize statistical dependencies. To achieve this, we combine a specialized variational autoencoder with a novel loss function driven by non-parametric nearest-neighbor density estimation, enabling direct optimization of independence. We evaluate our approach on multiple datasets, demonstrating that it can outperform existing unsupervised techniques and even rival supervised methods in balancing information removal and utility.
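The paper does not spell out its loss here, but the non-parametric nearest-neighbor density estimation it builds on is the classic k-NN estimator, p(x) ≈ k / (n · V_d · r_k(x)^d), where r_k(x) is the distance from x to its k-th nearest sample and V_d is the volume of the d-dimensional unit ball. A minimal sketch (the function name and parameters are illustrative, not from the paper):

```python
import math
import numpy as np

def knn_density(points, queries, k=5):
    """Classic k-NN density estimate: p(x) ~ k / (n * V_d * r_k(x)^d).

    points:  (n, d) array of samples drawn from the unknown density.
    queries: (m, d) array of locations at which to estimate the density.
    """
    n, d = points.shape
    # Pairwise distances from each query to every sample point.
    dists = np.linalg.norm(queries[:, None, :] - points[None, :, :], axis=-1)
    # Distance to the k-th nearest neighbor of each query.
    r_k = np.sort(dists, axis=1)[:, k - 1]
    # Volume of the d-dimensional unit ball.
    unit_ball_vol = math.pi ** (d / 2) / math.gamma(d / 2 + 1)
    return k / (n * unit_ball_vol * r_k ** d)

# Example: samples spread uniformly on [0, 1] should have density ~1.
pts = np.linspace(0.0, 1.0, 1001)[:, None]
print(knn_density(pts, np.array([[0.5]]), k=10))  # close to 1.0
```

Because the estimate is differentiable in the neighbor distances, a loss built on it can be optimized directly with the encoder, which is presumably what enables the "direct optimization of independence" the abstract refers to.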

Kathleen Anderson, Thomas Martinetz • 2026

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Classification | CheXpert (val) | - | - | 13 |
| Attribute Classification | CheXpert original (test) | Devices Accuracy | 66.1 | 7 |
| Attribute Classification | MNIST original (test) | Background Accuracy | 51.8 | 7 |
| Attribute Prediction | FFHQ original (test) | Gender Accuracy | 58.2 | 7 |
