
An Information-theoretic Perspective of Hierarchical Clustering

About

A combinatorial cost function for hierarchical clustering was introduced by Dasgupta \cite{dasgupta2016cost} and later generalized by Cohen-Addad et al. \cite{cohen2019hierarchical} to a broader class called admissible functions. In this paper, we investigate hierarchical clustering from the \emph{information-theoretic} perspective, formulate a new objective function, and establish the relationship between the two perspectives. On the algorithmic side, we depart from the traditional top-down and bottom-up frameworks and propose a new one that recursively stratifies the \emph{sparsest} level of a cluster tree, guided by our objective function. Notably, the resulting cluster tree is not binary, which makes it more suitable for practical use. Our algorithm, called HCSE, outputs a $k$-level cluster tree, choosing $k$ automatically via a novel and interpretable mechanism without any hyper-parameters. Experimental results on synthetic datasets show that HCSE has a clear advantage in recovering the intrinsic number of hierarchy levels, and results on real datasets show that HCSE achieves costs competitive with the popular algorithms LOUVAIN and HLP.
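For context, Dasgupta's cost \cite{dasgupta2016cost} charges each similarity edge $(i, j)$ the number of leaves under the lowest common ancestor of $i$ and $j$ in the cluster tree, so a good tree separates dissimilar points near the root. Below is a minimal sketch of that definition. The nested-list tree encoding and the `frozenset`-keyed weight map are illustrative choices for this sketch, not an API from the paper; note the encoding also admits non-binary trees, as HCSE produces.

```python
import itertools

def dasgupta_cost(tree, weights):
    """Dasgupta's cost: sum over pairs (i, j) of w(i, j) * |leaves(lca(i, j))|.

    `tree` is a nested structure: a leaf is an int node id, an internal
    node is a list of subtrees (the tree need not be binary).
    `weights` maps frozenset({i, j}) -> similarity weight w(i, j).
    """
    total = 0

    def leaves(t):
        # Collect the leaf ids under subtree t.
        if isinstance(t, int):
            return {t}
        out = set()
        for child in t:
            out |= leaves(child)
        return out

    def walk(t):
        nonlocal total
        if isinstance(t, int):
            return
        child_leaves = [leaves(c) for c in t]
        n = sum(len(s) for s in child_leaves)
        # Pairs first split at this node have their LCA here, so each
        # such pair pays w(i, j) times the size of this node's leaf set.
        for a, b in itertools.combinations(child_leaves, 2):
            for i in a:
                for j in b:
                    total += weights.get(frozenset((i, j)), 0) * n
        for child in t:
            walk(child)

    walk(tree)
    return total

# Toy graph: triangle {0, 1, 2} plus vertex 3 attached to 2, unit weights.
w = {frozenset(p): 1 for p in [(0, 1), (1, 2), (0, 2), (2, 3)]}
tree = [[0, [1, 2]], 3]  # the tree ((0, (1, 2)), 3)
print(dasgupta_cost(tree, w))  # → 12
```

On this toy input, the edge $(2, 3)$ pays $4$ at the root, edges $(0, 1)$ and $(0, 2)$ pay $3$ each, and $(1, 2)$ pays $2$, for a total of $12$; the "Dasgupta Cost" entries in the benchmark table below are this quantity on much larger graphs.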

Yicheng Pan, Feng Zheng, Bingchen Fan • 2021

Related benchmarks

Task                                   Dataset        Metric             Result    Rank
Hierarchical Agglomerative Clustering  Wine           Dendrogram Purity  0.711     26
Hierarchical Agglomerative Clustering  Iris           Dendrogram Purity  0.897     20
Hierarchical Agglomerative Clustering  Digits         Dendrogram Purity  0.815     20
Hierarchical Clustering                zoo            Dendrogram Purity  0.973     10
Hierarchical Clustering                OpticalDigits  Dasgupta Cost      3.11e+5   10
Hierarchical Clustering                Spambase       Dasgupta Cost      4.53e+7   10
Hierarchical Clustering                Br. Cancer     Dendrogram Purity  94.2      10
Hierarchical Clustering                Spambase       Dendrogram Purity  55.2      10
Hierarchical Clustering                pendigits      Dendrogram Purity  76.9      9
