
Hamba: Single-view 3D Hand Reconstruction with Graph-guided Bi-Scanning Mamba

About

3D hand reconstruction from a single RGB image is challenging due to articulated motion, self-occlusion, and interaction with objects. Existing SOTA methods employ attention-based transformers to learn 3D hand pose and shape, yet they fall short of robust and accurate performance, primarily because they model the spatial relations between joints inefficiently. To address this problem, we propose a novel graph-guided Mamba framework, named Hamba, which bridges graph learning and state space modeling. Our core idea is to reformulate Mamba's scanning into graph-guided bidirectional scanning for 3D reconstruction using only a few effective tokens. This enables efficient learning of the spatial relationships between joints, improving reconstruction performance. Specifically, we design a Graph-guided State Space (GSS) block that learns the graph-structured relations and spatial sequences of joints while using 88.5% fewer tokens than attention-based methods. Additionally, we integrate the state space features and the global features with a fusion module. By combining the GSS block and the fusion module, Hamba effectively leverages graph-guided state space features and jointly considers global and local features to improve performance. Experiments on several benchmarks and in-the-wild tests demonstrate that Hamba significantly outperforms existing SOTAs, achieving a PA-MPVPE of 5.3 mm and an F@15mm of 0.992 on FreiHAND. At the time of this paper's acceptance, Hamba held Rank 1 on two competition leaderboards for 3D hand reconstruction. Project website: https://humansensinglab.github.io/Hamba/
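The abstract names three ingredients: graph-guided mixing of joint tokens, bidirectional (forward and backward) scanning, and fusion of local state space features with global features. The sketch below illustrates how these pieces fit together, assuming a standard 21-joint hand skeleton; the simple exponential-decay recurrence standing in for Mamba's selective scan, the fusion-by-addition, and all function names are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch of graph-guided bidirectional scanning over hand-joint
# tokens. NOT the Hamba implementation: the decay recurrence below is the
# simplest possible linear state space model, used only to show the data flow.
import numpy as np

NUM_JOINTS = 21  # standard 21-joint hand skeleton (wrist + 5 fingers x 4 joints)

# Kinematic-tree edges: wrist (0) connects to each finger base; each finger
# is a chain of four joints.
EDGES = [(0, f) for f in (1, 5, 9, 13, 17)] + [
    (j, j + 1) for f in (1, 5, 9, 13, 17) for j in (f, f + 1, f + 2)
]

def normalized_adjacency(num_nodes, edges):
    """Symmetrically normalized adjacency with self-loops (GCN-style)."""
    a = np.eye(num_nodes)
    for i, j in edges:
        a[i, j] = a[j, i] = 1.0
    d_inv_sqrt = np.diag(1.0 / np.sqrt(a.sum(axis=1)))
    return d_inv_sqrt @ a @ d_inv_sqrt

def scan(x, decay=0.9):
    """Sequential recurrence h_t = decay * h_{t-1} + x_t over joint tokens."""
    h = np.zeros_like(x[0])
    out = []
    for x_t in x:
        h = decay * h + x_t
        out.append(h)
    return np.stack(out)

def gss_block(joint_tokens, global_token):
    """Graph-guided mixing -> forward + backward scans -> fuse with a global token."""
    a_norm = normalized_adjacency(NUM_JOINTS, EDGES)
    graph_feats = a_norm @ joint_tokens          # graph-guided token mixing
    fwd = scan(graph_feats)                      # forward scan along the joint order
    bwd = scan(graph_feats[::-1])[::-1]          # backward scan, then restore order
    local = 0.5 * (fwd + bwd)                    # merge the two scan directions
    return local + global_token                  # naive local/global fusion

rng = np.random.default_rng(0)
tokens = rng.standard_normal((NUM_JOINTS, 64))   # one 64-d token per joint
fused = gss_block(tokens, rng.standard_normal(64))
print(fused.shape)  # (21, 64)
```

The point of the sketch is the token budget: the block operates on 21 joint tokens plus one global token, rather than the dense grid of image patch tokens an attention-based decoder would process, which is the source of the token savings the abstract quantifies.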

Haoye Dong, Aviral Chharia, Wenbo Gou, Francisco Vicente Carrasco, Fernando De la Torre• 2024

Related benchmarks

Task                          | Dataset         | Metric          | Result | Rank
3D Hand Reconstruction        | FreiHAND (test) | F@15mm          | 99.2   | 154
Human Mesh Recovery           | 3DPW            | PA-MPJPE        | 54.7   | 140
3D Hand Pose Estimation       | NYU (test)      | Mean Error (mm) | 8.38   | 100
Hand Mesh Reconstruction      | HO3D v2 (test)  | F@5mm           | 0.648  | 44
3D Hand Pose Estimation       | FreiHAND        | PA-MPJPE (mm)   | 5.7    | 36
3D Hand Reconstruction        | FreiHAND        | PA-MPVPE        | 5.5    | 25
3D Hand Pose Estimation       | HO-3D v2        | PA-MPJPE (mm)   | 7.5    | 25
3D Hand-Object Interaction    | HO3D v2 (test)  | PA-MPJPE        | 7.5    | 20
3D Hand Reconstruction        | HO3D v3         | PA-MPJPE        | 6.9    | 18
3D Hand-Object Reconstruction | HO3D v2         | MPJPE           | 7.5    | 16

Showing 10 of 24 rows.
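For readers unfamiliar with the metrics quoted above: PA-MPVPE is the mean per-vertex position error after Procrustes (similarity) alignment of the predicted mesh to the ground truth, and F@15mm is the F-score (harmonic mean of point-cloud precision and recall) at a 15 mm distance threshold. Below is a generic, textbook sketch of both, not the benchmarks' official evaluation code; the synthetic data and all names are illustrative.

```python
# Textbook implementations of PA-MPVPE and F-score@threshold for point sets,
# for illustration only (not FreiHAND's official evaluation script).
import numpy as np

def procrustes_align(pred, gt):
    """Similarity alignment (scale, rotation, translation) of pred onto gt."""
    mu_p, mu_g = pred.mean(axis=0), gt.mean(axis=0)
    p, g = pred - mu_p, gt - mu_g
    u, s, vt = np.linalg.svd(p.T @ g)        # orthogonal Procrustes problem
    r = u @ vt                               # optimal rotation: p @ r ~ g
    scale = s.sum() / (p ** 2).sum()         # optimal isotropic scale
    return scale * p @ r + mu_g

def mpvpe(pred, gt):
    """Mean per-vertex position error (Euclidean, assuming correspondence)."""
    return np.linalg.norm(pred - gt, axis=1).mean()

def f_score(pred, gt, thr):
    """F-score at a distance threshold via nearest-neighbor precision/recall."""
    d = np.linalg.norm(pred[:, None] - gt[None, :], axis=-1)
    precision = (d.min(axis=1) < thr).mean()
    recall = (d.min(axis=0) < thr).mean()
    return 2 * precision * recall / (precision + recall + 1e-12)

# Synthetic check: a rotated, scaled, shifted, noisy copy of a point set.
rng = np.random.default_rng(0)
gt = rng.standard_normal((100, 3)) * 0.05    # coordinates in metres
theta = 0.3
rot = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                [np.sin(theta),  np.cos(theta), 0.0],
                [0.0,            0.0,           1.0]])
pred = 1.1 * gt @ rot.T + 0.02 + rng.normal(0, 0.001, gt.shape)

aligned = procrustes_align(pred, gt)
print(f"PA-MPVPE: {mpvpe(aligned, gt) * 1000:.2f} mm")
print(f"F@15mm:   {f_score(aligned, gt, 0.015):.3f}")
```

The "PA" prefix matters: because the alignment factors out global scale, rotation, and translation, PA-MPJPE/PA-MPVPE measure only the articulated pose and shape error, which is why they are the headline numbers on these leaderboards.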

Other info

Code
