
Differentiable Unsupervised Feature Selection based on a Gated Laplacian

About

Scientific observations may consist of a large number of variables (features). Identifying a subset of meaningful features is often overlooked in unsupervised learning, despite its potential for revealing clear patterns hidden in the ambient space. In this paper, we present a method for unsupervised feature selection and demonstrate its use for the task of clustering. We propose a differentiable loss function that combines the Laplacian score, which favors low-frequency features, with a gating mechanism for feature selection. We improve the Laplacian score by replacing it with a gated variant computed on a subset of features. This subset is obtained using a continuous approximation of Bernoulli variables whose parameters are trained to gate the full feature space. We mathematically motivate the proposed approach and demonstrate that in the high-noise regime it is crucial to compute the Laplacian on the gated inputs, rather than on the full feature set. We demonstrate the efficacy of the proposed approach and its advantage over current baselines on several real-world examples.
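The core idea in the abstract can be sketched in a few lines of NumPy: relax each Bernoulli gate with a clipped-Gaussian surrogate, gate the inputs, build the graph Laplacian from the *gated* inputs (not the full feature set), and score each feature by its smoothness on that graph. This is a minimal illustrative sketch, not the authors' implementation; the gate parameterization (`mu`, `sigma`), the RBF `bandwidth`, and the function name are assumptions for illustration.

```python
import numpy as np

def gated_laplacian_score(X, mu, sigma=0.5, bandwidth=1.0, rng=None):
    """Sketch: Laplacian score computed on stochastically gated features.

    Each gate is a continuous surrogate for a Bernoulli variable,
    z_d = clip(mu_d + sigma * eps_d, 0, 1), eps_d ~ N(0, 1),
    so gradients can flow to the gate parameters mu. The affinity graph
    (and hence the Laplacian) is built from the gated inputs, which the
    paper argues is crucial in the high-noise regime.
    """
    rng = np.random.default_rng(rng)
    eps = rng.standard_normal(mu.shape)
    z = np.clip(mu + sigma * eps, 0.0, 1.0)   # relaxed Bernoulli gates in [0, 1]
    Xg = X * z                                # gated inputs

    # RBF affinity and unnormalized graph Laplacian on the gated inputs
    sq = ((Xg[:, None, :] - Xg[None, :, :]) ** 2).sum(-1)
    W = np.exp(-sq / (2.0 * bandwidth ** 2))
    D = np.diag(W.sum(axis=1))
    L = D - W

    # Laplacian score per feature: f^T L f / f^T D f on centered features.
    # Low score = feature varies smoothly over the graph (low frequency).
    f = Xg - Xg.mean(axis=0)
    num = np.einsum('nd,nm,md->d', f, L, f)
    den = np.einsum('nd,nm,md->d', f, D, f) + 1e-12
    return num / den, z
```

In a full training loop these scores would feed a differentiable loss (plus a regularizer pushing gates toward 0/1), with `mu` updated by gradient descent so that noisy features are gated out.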

Ofir Lindenbaum, Uri Shaham, Jonathan Svirsky, Erez Peterfreund, Yuval Kluger• 2020

Related benchmarks

Task            Dataset       Result        Rank
Classification  COIL-20       Accuracy: 1   76
Clustering      COIL-20       ACC: 59       47
Classification  Lung          ACC: 87       46
Classification  GLIOMA        Accuracy: 66  46
Clustering      Yale          Accuracy: 55  37
Classification  Prostate      Accuracy: 81  32
Classification  Yale          Accuracy: 72  28
Classification  warpPIE 10P   Accuracy: 96  26
Classification  PCMAC         Accuracy: 77  26
Classification  SMK           Accuracy: 66  26

(showing 10 of 48 rows)
