
MiCE: Mixture of Contrastive Experts for Unsupervised Image Clustering

About

We present Mixture of Contrastive Experts (MiCE), a unified probabilistic clustering framework that simultaneously exploits the discriminative representations learned by contrastive learning and the semantic structures captured by a latent mixture model. Motivated by the mixture of experts, MiCE employs a gating function to partition an unlabeled dataset into subsets according to the latent semantics, and multiple experts to discriminate the distinct subsets of instances assigned to them in a contrastive manner. To solve the nontrivial inference and learning problems caused by the latent variables, we further develop a scalable variant of the Expectation-Maximization (EM) algorithm for MiCE and provide a proof of convergence. Empirically, we evaluate the clustering performance of MiCE on four widely adopted natural image datasets. MiCE achieves significantly better results than various previous methods and a strong contrastive learning baseline.
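The two ingredients described above can be sketched numerically: an E-step that computes a soft gating posterior over experts from embedding-prototype similarities, and an M-step objective that weights a per-expert InfoNCE (contrastive) loss by that posterior. The sketch below is a simplified illustration in NumPy, not the paper's implementation; the prototype-based gating, temperature values, and toy data are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

def l2_normalize(x, axis=-1):
    # project embeddings onto the unit sphere, as is standard in contrastive learning
    return x / np.linalg.norm(x, axis=axis, keepdims=True)

def gating_posterior(z, prototypes, tau=0.5):
    # E-step sketch: soft assignment of each embedding to an expert (cluster),
    # via a softmax over cosine similarities to expert prototypes
    logits = z @ prototypes.T / tau                    # (n, k)
    logits -= logits.max(axis=1, keepdims=True)        # numerical stability
    p = np.exp(logits)
    return p / p.sum(axis=1, keepdims=True)

def weighted_info_nce(z1, z2, weights, tau=0.5):
    # M-step objective sketch: InfoNCE between two augmented views,
    # with each instance's loss weighted by its gating posterior for one expert
    sim = z1 @ z2.T / tau                              # (n, n); positives on the diagonal
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    return -(weights * np.diag(log_prob)).mean()

# toy data (assumed for illustration): 64 embeddings in 16-d, 4 experts
z1 = l2_normalize(rng.normal(size=(64, 16)))
z2 = l2_normalize(z1 + 0.05 * rng.normal(size=(64, 16)))   # second "augmented" view
prototypes = l2_normalize(rng.normal(size=(4, 16)))

gamma = gating_posterior(z1, prototypes)               # (64, 4), rows sum to 1
loss = sum(weighted_info_nce(z1, z2, gamma[:, k]) for k in range(4))
```

In the full method, the EM iteration alternates these two steps at scale; this toy version only shows how the gating posterior routes instances to experts and how each expert's contrastive loss is weighted accordingly.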

Tsung Wei Tsai, Chongxuan Li, Jun Zhu • 2021

Related benchmarks

Task              Dataset            Metric    Result  Rank
Image Clustering  CIFAR-10           NMI       0.737   243
Image Clustering  STL-10             ACC       75.2    229
Clustering        CIFAR-10 (test)    ACC       83.5    184
Clustering        STL-10 (test)      ACC       75.2    146
Clustering        CIFAR-100 (test)   ACC       44      110
Image Clustering  CIFAR-100          ACC       44      101
Clustering        CIFAR100 20        ACC       44      93
Image Clustering  Imagenet dog-15    NMI       42.3    90
Grouping          Imagenet Dogs      ACC       43.9    59
Clustering        Imagenet Dogs      NMI       42.3    46

(Showing 10 of 12 rows.)

Other info

Code
