
Contrastive Hierarchical Clustering

About

Deep clustering has been dominated by flat models, which split a dataset into a predefined number of groups. Although recent methods achieve extremely high agreement with the ground truth on popular benchmarks, the information contained in a flat partition is limited. In this paper, we introduce CoHiClust, a Contrastive Hierarchical Clustering model based on deep neural networks, which can be applied to typical image data. Using a self-supervised learning approach, CoHiClust distills the base network into a binary tree without access to any labeled data. The resulting hierarchical clustering structure can be used to analyze the relationships between clusters, as well as to measure the similarity between data points. Experiments demonstrate that CoHiClust generates a reasonable cluster structure that is consistent with our intuition and image semantics. Moreover, it obtains superior clustering accuracy on most image datasets compared to state-of-the-art flat clustering models.
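The two tree-based operations mentioned above (assigning a sample to a leaf cluster and measuring sample similarity through the hierarchy) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function names, the heap-order node indexing, and the idea of representing each internal node by a left-routing probability produced by the network head are all our own assumptions.

```python
def route_to_leaf(left_probs, depth):
    """Greedily route one sample down a complete binary tree.

    left_probs[node] is the (hypothetical) probability, produced by the
    network head, of taking the left child at internal node `node`.
    Nodes are indexed in heap order: root = 0, children = 2i+1 / 2i+2.
    Returns the leaf's path as a bit string ('0' = left, '1' = right),
    which identifies the flat cluster at the given depth.
    """
    node, path = 0, ""
    for _ in range(depth):
        go_left = left_probs[node] >= 0.5
        path += "0" if go_left else "1"
        node = 2 * node + 1 if go_left else 2 * node + 2
    return path


def tree_similarity(path_a, path_b):
    """Similarity of two samples = depth of the lowest common ancestor
    of their leaves, i.e. the length of their common path prefix."""
    sim = 0
    for a, b in zip(path_a, path_b):
        if a != b:
            break
        sim += 1
    return sim
```

For example, two samples routed to leaves "00" and "01" share only the root's left subtree, so their similarity is 1, while samples landing in the same leaf of a depth-2 tree have similarity 2.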

Michał Znaleźniak, Przemysław Rola, Patryk Kaszuba, Jacek Tabor, Marek Śmieja • 2023

Related benchmarks

Task                     Dataset                  Metric    Result  Rank
Clustering               CIFAR-10 (test)          Accuracy  83.9    184
Clustering               STL-10 (test)            Accuracy  61.3    146
Clustering               MNIST (test)             NMI       0.97    122
Clustering               CIFAR-100 (test)         ACC       43.7    110
Clustering               ImageNet-10 (test)       ACC       95.3    69
Clustering               ImageNet-Dogs (test)     NMI       0.411   35
Clustering               CIFAR-100 (train+test)   --        --      6
Hierarchical Clustering  F-MNIST (test)           DP        0.52    5
Hierarchical Clustering  CIFAR-10 (train+test)    DP        0.715   1
Hierarchical Clustering  STL-10 (train+test)      DP        0.53    1

(Showing 10 of 12 rows)

Other info

Code
