
Reconstructing Hands in 3D with Transformers

About

We present an approach that can reconstruct hands in 3D from monocular input. Our approach for Hand Mesh Recovery, HaMeR, follows a fully transformer-based architecture and can analyze hands with significantly increased accuracy and robustness compared to previous work. The key to HaMeR's success lies in scaling up both the data used for training and the capacity of the deep network for hand reconstruction. For training data, we combine multiple datasets that contain 2D or 3D hand annotations. For the deep model, we use a large scale Vision Transformer architecture. Our final model consistently outperforms the previous baselines on popular 3D hand pose benchmarks. To further evaluate the effect of our design in non-controlled settings, we annotate existing in-the-wild datasets with 2D hand keypoint annotations. On this newly collected dataset of annotations, HInt, we demonstrate significant improvements over existing baselines. We make our code, data and models available on the project website: https://geopavlakos.github.io/hamer/.
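The abstract describes a fully transformer-based pipeline: a large Vision Transformer encodes the image into patch tokens, and regression heads predict hand-model (MANO) parameters. Below is a minimal sketch of that kind of architecture; the module names, token dimension, depth, and head sizes are illustrative assumptions, not the paper's actual configuration.

```python
# Minimal sketch of a HaMeR-style pipeline. All sizes here are illustrative
# assumptions; the actual model uses a much larger ViT backbone.
import torch
import torch.nn as nn

class HandTransformerSketch(nn.Module):
    def __init__(self, img_size=256, patch=16, dim=192, depth=2, heads=4):
        super().__init__()
        n_patches = (img_size // patch) ** 2
        # Patch embedding: split the image into patches, project each to a token.
        self.patchify = nn.Conv2d(3, dim, kernel_size=patch, stride=patch)
        self.pos = nn.Parameter(torch.zeros(1, n_patches, dim))
        layer = nn.TransformerEncoderLayer(dim, heads, dim * 4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, depth)
        # Regression heads for MANO hand parameters and a weak-perspective camera
        # (head layout is a simplification of the paper's design).
        self.pose_head = nn.Linear(dim, 16 * 3)   # 16 joint rotations, axis-angle
        self.shape_head = nn.Linear(dim, 10)      # MANO shape coefficients
        self.cam_head = nn.Linear(dim, 3)         # scale + 2D translation

    def forward(self, img):
        tokens = self.patchify(img).flatten(2).transpose(1, 2) + self.pos
        feat = self.encoder(tokens).mean(dim=1)   # pooled image feature
        return self.pose_head(feat), self.shape_head(feat), self.cam_head(feat)

model = HandTransformerSketch()
pose, shape, cam = model(torch.randn(1, 3, 256, 256))
print(pose.shape, shape.shape, cam.shape)
```

The predicted MANO parameters would then be decoded by the MANO model into a 3D hand mesh; that step is omitted here since it depends on the MANO asset files.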

Georgios Pavlakos, Dandan Shan, Ilija Radosavovic, Angjoo Kanazawa, David Fouhey, Jitendra Malik • 2023

Related benchmarks

| Task | Dataset | Result | Rank |
|---|---|---|---|
| 3D Hand Reconstruction | FreiHAND (test) | F@15mm: 99 | 148 |
| Hand Mesh Reconstruction | HO3D v2 (test) | F@5: 0.635 | 34 |
| Hand Pose Estimation | HInt New Days v1 (test) | PCK@0.05: 60.8 | 32 |
| Hand Pose Estimation | HInt - VISOR v1 (test) | PCK@0.05: 56.6 | 32 |
| Hand Pose Estimation | Ego4D HInt v1 (test) | PCK@0.05: 52 | 32 |
| 3D Hand-Object Interaction | HO3D v2 (test) | PA-MPJPE: 7.6 | 20 |
| 3D Hand Pose Estimation | H2O | MPJPE Right: 23.82 | 14 |
| Hand Pose Estimation | EgoExo4D 1.0 (test) | PA-MPJPE (mm): 13.04 | 13 |
| Occluded Hand Joint Reconstruction | HInt Benchmark v1 (test) | NewDays PCK@0.05: 27.2 | 11 |
| 3D Hand Mesh Reconstruction | HInt NewDays Occluded Joints (test) | PCK@0.05: 28.9 | 8 |

Showing 10 of 40 rows
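The metrics above follow common conventions in hand pose evaluation: PCK@0.05 is the fraction of predicted keypoints within a threshold (typically 0.05 of the hand bounding-box size) of the ground truth, and PA-MPJPE is the mean per-joint position error after rigid Procrustes alignment. A sketch of both, using standard definitions rather than any benchmark's official code:

```python
# Sketch of two common hand-pose metrics (PCK and PA-MPJPE). Definitions
# follow standard usage; thresholds and joint counts are illustrative.
import numpy as np

def pck(pred, gt, threshold):
    """Fraction of predicted keypoints within `threshold` of ground truth."""
    dists = np.linalg.norm(pred - gt, axis=-1)
    return float((dists <= threshold).mean())

def pa_mpjpe(pred, gt):
    """Mean per-joint position error after Procrustes (similarity) alignment."""
    mu_p, mu_g = pred.mean(0), gt.mean(0)
    p, g = pred - mu_p, gt - mu_g
    # Orthogonal Procrustes: rotation R and scale s minimizing ||s * p @ R - g||.
    U, S, Vt = np.linalg.svd(p.T @ g)
    R = U @ Vt
    if np.linalg.det(R) < 0:          # fix an improper rotation (reflection)
        U[:, -1] *= -1
        S[-1] *= -1
        R = U @ Vt
    s = S.sum() / (p ** 2).sum()
    aligned = s * p @ R + mu_g
    return float(np.linalg.norm(aligned - gt, axis=-1).mean())

# Sanity check: a rigidly rotated + translated copy has zero PA-MPJPE.
rng = np.random.default_rng(0)
gt = rng.standard_normal((21, 3))     # 21 hand joints
theta = 0.3
Rz = np.array([[np.cos(theta), -np.sin(theta), 0],
               [np.sin(theta),  np.cos(theta), 0],
               [0, 0, 1]])
pred = gt @ Rz.T + 0.1                # rotated and translated copy
print(round(pa_mpjpe(pred, gt), 6))   # 0.0
```

Note that PA-MPJPE factors out global rotation, scale, and translation, so it isolates articulation accuracy, whereas PCK on 2D keypoints also reflects how well the reconstruction reprojects into the image.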

Other info

Code: https://geopavlakos.github.io/hamer/
