# Deep Subspace Clustering Networks

## About
We present a novel deep neural network architecture for unsupervised subspace clustering. This architecture is built upon deep auto-encoders, which non-linearly map the input data into a latent space. Our key idea is to introduce a novel self-expressive layer between the encoder and the decoder to mimic the "self-expressiveness" property that has proven effective in traditional subspace clustering. Being differentiable, our new self-expressive layer provides a simple but effective way to learn pairwise affinities between all data points through a standard back-propagation procedure. Being nonlinear, our neural-network-based method is able to cluster data points having complex (often nonlinear) structures. We further propose pre-training and fine-tuning strategies that let us effectively learn the parameters of our subspace clustering networks. Our experiments show that the proposed method significantly outperforms the state-of-the-art unsupervised subspace clustering methods.
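To make the self-expressiveness idea concrete: each latent code is written as a linear combination of the other codes, and the combination weights `C` become pairwise affinities. The sketch below is an illustration, not the paper's training procedure: instead of learning `C` as the weights of a self-expressive layer by back-propagation, it uses the ridge-regression closed form on fixed latent codes `Z`, and zeroes the diagonal of `C` afterwards as a post-hoc approximation of the paper's diag(C) = 0 constraint. The toy data and the function name are ours.

```python
import numpy as np

def self_expressive_coefficients(Z, lam=1e-2):
    """Solve min_C ||Z - Z C||_F^2 + lam ||C||_F^2 in closed form.

    Z: (d, n) latent codes, one column per data point. In the paper, C is
    instead learned as the weights of the self-expressive layer between the
    encoder and decoder, via standard back-propagation.
    """
    n = Z.shape[1]
    G = Z.T @ Z  # Gram matrix of the latent codes
    C = np.linalg.solve(G + lam * np.eye(n), G)
    np.fill_diagonal(C, 0.0)  # approximation: a point should not represent itself
    return C

# Toy data: 5 points on each of two orthogonal 1-D subspaces in R^3.
rng = np.random.default_rng(0)
b1 = np.array([1.0, 0.0, 0.0])
b2 = np.array([0.0, 1.0, 1.0])
Z = np.column_stack([b1 * t for t in rng.uniform(1, 2, 5)] +
                    [b2 * t for t in rng.uniform(1, 2, 5)])

C = self_expressive_coefficients(Z)
W = np.abs(C) + np.abs(C).T  # symmetric affinity matrix for spectral clustering
```

Because the two toy subspaces are orthogonal, `C` (and hence `W`) comes out block-diagonal: affinities are large within a subspace and essentially zero across subspaces, which is exactly the structure a spectral clustering step then exploits to recover the clusters.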
## Related benchmarks

| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Subspace Clustering | HSI-Pavia 10 classes | Clustering Accuracy | 74.92 | 78 |
| Clustering | MNIST | Clustering Accuracy | 25.48 | 60 |
| Subspace Clustering | Extended Yale-B | Clustering Accuracy | 97.3 | 29 |
| Subspace Clustering | Extended Yale B 10 Subjects | Clustering Error | 1.25 | 26 |
| Subspace Clustering | Extended Yale B 10 Subjects | Mean Clustering Error | 1.59 | 22 |
| Subspace Clustering | Yale-B | ACC | 97.1 | 21 |
| Subspace Clustering | ORL | NMI | 91.5 | 19 |
| Multi-view Subspace Clustering | Still DB | NMI | 21.6 | 18 |
| Multi-view Subspace Clustering | Yale | NMI | 73.8 | 18 |
| Multi-view Subspace Clustering | ORL | NMI | 88.3 | 18 |