HaWoR: World-Space Hand Motion Reconstruction from Egocentric Videos
About
Despite advances in 3D hand pose estimation, current methods predominantly focus on single-image 3D hand reconstruction in the camera frame, overlooking the world-space motion of the hands. This limitation prevents their direct use in egocentric video settings, where both the hands and the camera are continuously in motion. In this work, we propose HaWoR, a high-fidelity method for hand motion reconstruction in world coordinates from egocentric videos. We decouple the task into reconstructing the hand motion in camera space and estimating the camera trajectory in the world coordinate system. To achieve precise camera trajectory estimation, we propose an adaptive egocentric SLAM framework that addresses the shortcomings of traditional SLAM methods, providing robust performance under challenging camera dynamics. To ensure robust hand motion trajectories even when the hands move out of the view frustum, we devise a novel motion infiller network that effectively completes the missing frames of the sequence. Through extensive quantitative and qualitative evaluations, we demonstrate that HaWoR achieves state-of-the-art performance on both hand motion reconstruction and world-frame camera trajectory estimation across multiple egocentric benchmark datasets. Code and models are available at https://hawor-project.github.io/ .
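The decoupling described above amounts to composing the camera-space hand motion with the per-frame camera poses recovered by the SLAM module. As a minimal sketch (not HaWoR's actual implementation; the function name, array shapes, and pose convention are illustrative assumptions), world-space joints follow from a per-frame rigid transform:

```python
import numpy as np

def hands_to_world(joints_cam, R_wc, t_wc):
    """Lift per-frame camera-space hand joints into world coordinates.

    joints_cam: (T, J, 3) hand joints in the camera frame
    R_wc:       (T, 3, 3) camera-to-world rotations (e.g. from SLAM)
    t_wc:       (T, 3)    camera positions in world coordinates
    """
    # x_world = R_wc @ x_cam + t_wc, applied independently at each frame t
    return np.einsum("tij,tkj->tki", R_wc, joints_cam) + t_wc[:, None, :]
```

Frames where the hands leave the view frustum have no camera-space estimate to transform, which is the gap the motion infiller network is designed to fill.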
Related benchmarks
| Task | Dataset | Result | Rank |
|---|---|---|---|
| 3D Hand Reconstruction | DexYCB (test) | -- | 28 |
| 3D Hand Mesh Reconstruction | ARCTIC General Setting | P-MPJPE: 6.2 | 5 |
| 3D Hand Mesh Reconstruction | ARCTIC Bimanual Setting | P-MPVPE: 6 | 5 |
| HOI Reconstruction | HOI4D (test) | -- | 5 |
| Global hand trajectory and reconstruction | HOI4D Short Video (test) | WA-MPJPE: 22.54 | 4 |
| Global hand trajectory and reconstruction | HOI4D Long Video (test) | WA-MPJPE: 27.4 | 4 |