
PPC-MT: Parallel Point Cloud Completion with Mamba-Transformer Hybrid Architecture

About

Existing point cloud completion methods struggle to balance high-quality reconstruction with computational efficiency. To address this, we propose PPC-MT, a novel parallel framework for point cloud completion leveraging a hybrid Mamba-Transformer architecture. Our approach introduces an innovative parallel completion strategy guided by Principal Component Analysis (PCA), which imposes a geometrically meaningful structure on unordered point clouds, transforming them into ordered sets and decomposing them into multiple subsets. These subsets are reconstructed in parallel using a multi-head reconstructor. This structured parallel synthesis paradigm significantly enhances the uniformity of point distribution and detail fidelity, while preserving computational efficiency. By integrating Mamba's linear complexity for efficient feature extraction during encoding with the Transformer's capability to model fine-grained multi-sequence relationships during decoding, PPC-MT effectively balances efficiency and reconstruction accuracy. Extensive quantitative and qualitative experiments on benchmark datasets, including PCN, ShapeNet-55/34, and KITTI, demonstrate that PPC-MT outperforms state-of-the-art methods across multiple metrics, validating the efficacy of our proposed framework.
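The PCA-guided strategy described above — imposing an order on an unordered point cloud and decomposing it into subsets for parallel reconstruction — can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the function name `pca_order_and_split` and the equal-size contiguous split are assumptions for demonstration.

```python
import numpy as np

def pca_order_and_split(points: np.ndarray, num_subsets: int = 4):
    """Order an unordered point cloud along its first principal axis
    and decompose it into contiguous subsets (sketch of the PCA-guided
    parallel strategy; details differ from the paper's method)."""
    # Center the cloud and compute principal directions via SVD.
    centered = points - points.mean(axis=0)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    principal_axis = vt[0]  # direction of largest variance

    # Project points onto the principal axis and sort by the
    # projection, turning the set into an ordered sequence.
    order = np.argsort(centered @ principal_axis)
    ordered = points[order]

    # Split the ordered sequence into subsets; each subset would be
    # handed to one head of a multi-head reconstructor in parallel.
    return np.array_split(ordered, num_subsets)

# Example: decompose 1024 random points into 4 ordered subsets.
cloud = np.random.default_rng(0).normal(size=(1024, 3))
subsets = pca_order_and_split(cloud, num_subsets=4)
```

Each resulting subset covers a contiguous slab of the shape along its dominant axis, which is what makes per-subset reconstruction geometrically meaningful rather than an arbitrary partition of points.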

Jie Li, Shengwei Tian, Long Yu, Xin Ning • 2026

Related benchmarks

Task                    Dataset                          Metric           Result   Rank
Point Cloud Completion  PCN (test)                       Average (L1 CD)  6.6      67
Point Cloud Completion  ShapeNet-34 (seen categories)    --               --       58
Point Cloud Completion  ShapeNet-34 (unseen categories)  --               --       45
Point Cloud Completion  PCN                              CD               6.6      37
Point Cloud Completion  ShapeNet55 (all)                 CD-L2            0.82     10
Point Cloud Completion  KITTI                            Fidelity         0.0015   4
