
Twin Contrastive Learning for Online Clustering

About

This paper proposes to perform online clustering by conducting twin contrastive learning (TCL) at the instance and cluster level. Specifically, we find that when the data is projected into a feature space whose dimensionality equals the target cluster number, the rows and columns of the feature matrix correspond to the instance and cluster representations, respectively. Based on this observation, for a given dataset, the proposed TCL first constructs positive and negative pairs through data augmentation. Thereafter, in the row and column space of the feature matrix, instance- and cluster-level contrastive learning are respectively conducted by pulling together positive pairs while pushing apart negative ones. To alleviate the influence of intrinsic false-negative pairs and rectify cluster assignments, we adopt a confidence-based criterion to select pseudo-labels for boosting both the instance- and cluster-level contrastive learning. As a result, the clustering performance is further improved. Besides the elegant idea of twin contrastive learning, another advantage of TCL is that it can independently predict the cluster assignment for each instance, thus effortlessly fitting online scenarios. Extensive experiments on six widely-used image and text benchmarks demonstrate the effectiveness of TCL. The code will be released on GitHub.
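The key observation above can be illustrated with a minimal NumPy sketch: given an N×K feature matrix (N instances, K target clusters) from two augmented views, an InfoNCE-style contrastive loss is applied over the rows for instance-level learning and over the columns (soft cluster assignments) for cluster-level learning. This is an illustrative sketch under assumed shapes and a generic InfoNCE form, not the authors' released implementation; all names here are hypothetical.

```python
import numpy as np

def info_nce(a, b, temperature=0.5):
    """Symmetric InfoNCE loss; a[i] and b[i] form a positive pair (shape (M, D))."""
    a = a / np.linalg.norm(a, axis=1, keepdims=True)
    b = b / np.linalg.norm(b, axis=1, keepdims=True)
    z = np.concatenate([a, b], axis=0)            # (2M, D) joint batch
    sim = z @ z.T / temperature                    # pairwise cosine similarities
    np.fill_diagonal(sim, -np.inf)                 # exclude self-similarity
    m = a.shape[0]
    pos = np.concatenate([np.arange(m, 2 * m), np.arange(m)])  # positive indices
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    return -log_prob[np.arange(2 * m), pos].mean()

rng = np.random.default_rng(0)
N, K = 8, 3                                        # instances x target clusters
f1 = rng.standard_normal((N, K))                   # view 1 feature matrix
f2 = f1 + 0.1 * rng.standard_normal((N, K))        # view 2 (perturbed positive)

# Rows of the feature matrix: instance-level contrastive learning.
instance_loss = info_nce(f1, f2)

# Softmax over each row yields soft cluster assignments; the columns of the
# resulting matrix act as cluster representations over the batch.
p1 = np.exp(f1) / np.exp(f1).sum(axis=1, keepdims=True)
p2 = np.exp(f2) / np.exp(f2).sum(axis=1, keepdims=True)
cluster_loss = info_nce(p1.T, p2.T)                # columns: cluster-level contrast

loss = instance_loss + cluster_loss

# Online prediction: each instance's cluster is read off its own row,
# independently of the rest of the batch.
pred = p1.argmax(axis=1)
```

Because the prediction is a per-row argmax, no batch-level post-processing (e.g. k-means on embeddings) is needed, which is what makes the online setting straightforward.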

Yunfan Li, Mouxing Yang, Dezhong Peng, Taihao Li, Jiantao Huang, Xi Peng • 2022

Related benchmarks

Task | Dataset | Result | Rank
--- | --- | --- | ---
Image Clustering | CIFAR-10 | NMI 0.819 | 243
Image Clustering | STL-10 | ACC 88.8 | 229
Clustering | CIFAR-10 (test) | Accuracy 88.7 | 184
Image Clustering | ImageNet-10 | NMI 0.875 | 166
Clustering | CIFAR-100 (test) | ACC 53.1 | 110
Image Clustering | CIFAR-100 | ACC 53.1 | 101
Clustering | CIFAR100 20 | ACC 53.1 | 93
Grouping | Imagenet Dogs | ACC 64.4 | 59
Clustering | Imagenet Dogs | NMI 62.3 | 46
Clustering | StackOverflow | NMI 78.8 | 13

Showing 10 of 13 rows.
