
Unsupervised Point Cloud Pre-Training via Contrasting and Clustering

About

Annotating large-scale point clouds is highly time-consuming and often infeasible for many complex real-world tasks. Point cloud pre-training has therefore become a promising strategy for learning discriminative representations without labeled data. In this paper, we propose a general unsupervised pre-training framework, termed ConClu, which jointly integrates contrasting and clustering. The contrasting objective maximizes the similarity between feature representations extracted from two augmented views of the same point cloud, while the clustering objective simultaneously partitions the data and enforces consistency between cluster assignments across augmentations. Experimental results on multiple downstream tasks show that our method outperforms state-of-the-art approaches, demonstrating the effectiveness of the proposed framework. Code is available at https://github.com/gfmei/conclu.
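To make the two objectives concrete, here is a minimal NumPy sketch of how a contrasting loss and a cluster-assignment consistency loss can be combined. This is not the authors' implementation: the function names, the InfoNCE form of the contrastive term, and the SwAV-style swapped-prediction form of the clustering term (soft assignments to a learnable prototype matrix) are all assumptions for illustration.

```python
import numpy as np

def l2_normalize(x, eps=1e-8):
    """Project feature rows onto the unit sphere."""
    return x / (np.linalg.norm(x, axis=-1, keepdims=True) + eps)

def softmax(logits):
    """Numerically stable row-wise softmax."""
    logits = logits - logits.max(axis=-1, keepdims=True)
    e = np.exp(logits)
    return e / e.sum(axis=-1, keepdims=True)

def contrastive_loss(z1, z2, temperature=0.1):
    """InfoNCE-style loss: the two augmented views of the same point cloud
    form the positive pair; all other pairs in the batch are negatives."""
    z1, z2 = l2_normalize(z1), l2_normalize(z2)
    logits = z1 @ z2.T / temperature            # (N, N) cosine similarities
    m = logits.max(axis=1, keepdims=True)       # log-sum-exp for stability
    log_prob = logits - (m + np.log(np.exp(logits - m).sum(axis=1, keepdims=True)))
    idx = np.arange(len(z1))
    return float(-log_prob[idx, idx].mean())    # positives sit on the diagonal

def clustering_consistency_loss(z1, z2, prototypes, temperature=0.1):
    """Swapped-prediction loss (assumed form): the soft cluster assignment of
    one view supervises the assignment predicted from the other view."""
    p1 = softmax(l2_normalize(z1) @ l2_normalize(prototypes).T / temperature)
    p2 = softmax(l2_normalize(z2) @ l2_normalize(prototypes).T / temperature)
    def cross_entropy(p, q):
        return -(p * np.log(q + 1e-8)).sum(axis=1).mean()
    return float(0.5 * (cross_entropy(p1, p2) + cross_entropy(p2, p1)))

def total_loss(z1, z2, prototypes, weight=1.0):
    """Joint objective: contrasting plus weighted clustering consistency."""
    return contrastive_loss(z1, z2) + weight * clustering_consistency_loss(z1, z2, prototypes)
```

In this sketch, `z1` and `z2` would be the encoder features of the two augmented views of a batch of point clouds, and `prototypes` a learnable cluster-center matrix; the contrastive term pulls matching views together while the clustering term makes their cluster assignments agree.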

Guofeng Mei, Xiaoshui Huang, Juan Liu, Jian Zhang, Qiang Wu • 2022

Related benchmarks

Task                  | Dataset                | Result         | Rank
Part Segmentation     | ShapeNetPart           | --             | 246
Object Classification | ModelNet10 (test)      | Accuracy: 95   | 60
Object Classification | ModelNet40 1.0 (test)  | Accuracy: 91.6 | 19
