DeepDPM: Deep Clustering With an Unknown Number of Clusters

About

Deep Learning (DL) has shown great promise in the unsupervised task of clustering. That said, while in classical (i.e., non-deep) clustering the benefits of the nonparametric approach are well known, most deep-clustering methods are parametric: namely, they require a predefined and fixed number of clusters, denoted by K. When K is unknown, however, using model-selection criteria to choose its optimal value might become computationally expensive, especially in DL as the training process would have to be repeated numerous times. In this work, we bridge this gap by introducing an effective deep-clustering method that does not require knowing the value of K as it infers it during the learning. Using a split/merge framework, a dynamic architecture that adapts to the changing K, and a novel loss, our proposed method outperforms existing nonparametric methods (both classical and deep ones). While the very few existing deep nonparametric methods lack scalability, we demonstrate ours by being the first to report the performance of such a method on ImageNet. We also demonstrate the importance of inferring K by showing how methods that fix it deteriorate in performance when their assumed K value gets further from the ground-truth one, especially on imbalanced datasets. Our code is available at https://github.com/BGU-CS-VIL/DeepDPM.
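To make the split/merge idea concrete, here is a minimal toy sketch of a loop that infers the number of clusters K by alternating assignment, split, and merge steps. This is an illustrative assumption-laden simplification in 1-D, not the authors' DeepDPM implementation: the thresholds (`split_var`, `merge_dist`), the crude min/max subcluster seeding, and the synthetic data are all invented for illustration.

```python
# Toy sketch of split/merge inference of K (illustrative only; NOT the
# DeepDPM algorithm). Starts from K=1 and lets splits/merges adapt K.
import random

random.seed(0)

# Three well-separated 1-D Gaussian blobs; ground-truth K is 3.
data = ([random.gauss(0.0, 0.5) for _ in range(50)]
        + [random.gauss(5.0, 0.5) for _ in range(50)]
        + [random.gauss(10.0, 0.5) for _ in range(50)])

def assign(points, centers):
    """Hard-assign each point to its nearest center."""
    clusters = [[] for _ in centers]
    for x in points:
        i = min(range(len(centers)), key=lambda j: abs(x - centers[j]))
        clusters[i].append(x)
    return clusters

def infer_k(points, split_var=1.0, merge_dist=1.0, iters=20):
    """Alternate assign / split / merge; thresholds are toy assumptions."""
    centers = [sum(points) / len(points)]  # start with a single cluster
    for _ in range(iters):
        clusters = [c for c in assign(points, centers) if c]
        centers = [sum(c) / len(c) for c in clusters]
        # Split step: a high-variance cluster is replaced by two seeds.
        new_centers = []
        for c, mu in zip(clusters, centers):
            var = sum((x - mu) ** 2 for x in c) / len(c)
            if var > split_var and len(c) > 1:
                new_centers += [min(c), max(c)]  # crude subcluster seeds
            else:
                new_centers.append(mu)
        # Merge step: collapse centers closer than merge_dist.
        merged = []
        for mu in sorted(new_centers):
            if merged and abs(mu - merged[-1]) < merge_dist:
                merged[-1] = (merged[-1] + mu) / 2
            else:
                merged.append(mu)
        centers = merged
    return centers

centers = infer_k(data)
print(len(centers))  # inferred K
```

The point of the sketch is only the control flow: K is never fixed in advance, and the split/merge moves let the model grow or shrink the number of clusters during training, which is what DeepDPM does (with learned features and a principled DPM-based criterion rather than these ad-hoc variance/distance thresholds).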

Meitar Ronen, Shahaf E. Finder, Oren Freifeld • 2022

Related benchmarks

Task        Dataset            Metric        Result    Rank
Clustering  MNIST              NMI           0.94      113
Clustering  USPS               NMI           88        104
Clustering  YTF                NMI           0.93      41
Clustering  REUTERS 10K        ACC           83        37
Clustering  STL-10             ACC           85        28
Clustering  DIGIT              ACC           85.4      25
Clustering  CIFAR10            Running Time  4.04e+3   21
Clustering  MNIST              Running Time  745       18
Clustering  LetterA-J (full)   Accuracy      0.612     17
Clustering  Fashion            Accuracy      36.52     17

Showing 10 of 29 rows.
