
Deep Subspace Clustering Networks

About

We present a novel deep neural network architecture for unsupervised subspace clustering. This architecture is built upon deep auto-encoders, which non-linearly map the input data into a latent space. Our key idea is to introduce a novel self-expressive layer between the encoder and the decoder to mimic the "self-expressiveness" property that has proven effective in traditional subspace clustering. Being differentiable, our new self-expressive layer provides a simple but effective way to learn pairwise affinities between all data points through a standard back-propagation procedure. Being nonlinear, our neural-network based method is able to cluster data points having complex (often nonlinear) structures. We further propose pre-training and fine-tuning strategies that let us effectively learn the parameters of our subspace clustering networks. Our experiments show that the proposed method significantly outperforms the state-of-the-art unsupervised subspace clustering methods.

Pan Ji, Tong Zhang, Hongdong Li, Mathieu Salzmann, Ian Reid • 2017
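The self-expressiveness property described in the abstract says that each latent code should be reconstructable as a linear combination of the other codes, Z ≈ CZ with a zero-diagonal coefficient matrix C; |C| + |C|ᵀ then serves as the affinity for spectral clustering. A minimal sketch of that idea on toy data, using plain gradient descent on the ridge-regularized objective in place of the paper's trainable layer and back-propagation (all names, the solver, and the toy subspaces are illustrative, not the authors' code):

```python
import numpy as np

def fit_self_expressive(Z, lam=0.01, steps=500):
    """Minimize ||C Z - Z||_F^2 + lam * ||C||_F^2 subject to diag(C) = 0.

    Illustrative stand-in for the paper's self-expressive layer: here the
    latent codes Z (n x d) are fixed and C is fit by projected gradient
    descent rather than learned end-to-end inside an auto-encoder.
    """
    n = Z.shape[0]
    C = np.zeros((n, n))
    # Step size = 1 / Lipschitz constant of the quadratic objective.
    step = 1.0 / (2.0 * (np.linalg.norm(Z, 2) ** 2 + lam))
    for _ in range(steps):
        grad = 2.0 * ((C @ Z - Z) @ Z.T) + 2.0 * lam * C
        C -= step * grad
        np.fill_diagonal(C, 0.0)  # forbid the trivial self-representation
    return C

rng = np.random.default_rng(0)
# Toy latent space: two orthogonal 1-D subspaces (lines) in R^3,
# five points on each.
b1 = rng.normal(size=3)
b2 = rng.normal(size=3)
b2 -= (b2 @ b1) / (b1 @ b1) * b1  # orthogonalize the second direction
Z = np.vstack([np.outer(rng.normal(size=5), b1),
               np.outer(rng.normal(size=5), b2)])

C = fit_self_expressive(Z)
A = np.abs(C) + np.abs(C).T  # symmetric affinity matrix
# Because the subspaces are independent, A is (near) block diagonal:
# each point draws its weight from points on its own line, which is
# exactly what spectral clustering on A exploits.
```

In the paper this C is the weight matrix of a fully-connected linear layer inserted between encoder and decoder, so the same coefficients are learned jointly with the nonlinear embedding by standard back-propagation.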

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
| --- | --- | --- | --- | --- |
| Subspace Clustering | HSI-Pavia 10 classes | Clustering Accuracy | 74.92 | 78 |
| Clustering | MNIST | Clustering Accuracy | 25.48 | 60 |
| Subspace Clustering | Extended Yale-B | Clustering Accuracy | 97.3 | 29 |
| Subspace Clustering | Extended Yale B 10 Subjects | Clustering Error | 1.25 | 26 |
| Subspace Clustering | Extended Yale B 10 Subjects | Mean Clustering Error | 1.59 | 22 |
| Subspace Clustering | Yale-B | ACC | 97.1 | 21 |
| Subspace Clustering | ORL | NMI | 91.5 | 19 |
| Multi-view Subspace Clustering | Still DB | NMI | 21.6 | 18 |
| Multi-view Subspace Clustering | Yale | NMI | 73.8 | 18 |
| Multi-view Subspace Clustering | ORL | NMI | 88.3 | 18 |

Showing 10 of 25 rows.

Other info

Code
