
TRAM: Global Trajectory and Motion of 3D Humans from in-the-wild Videos

About

We propose TRAM, a two-stage method to reconstruct a human's global trajectory and motion from in-the-wild videos. TRAM robustifies SLAM to recover the camera motion in the presence of dynamic humans and uses the scene background to derive the motion scale. Using the recovered camera as a metric-scale reference frame, we introduce a video transformer model (VIMO) to regress the kinematic body motion of a human. By composing the two motions, we achieve accurate recovery of 3D humans in world space, reducing global motion errors by a large margin over prior work.

Project page: https://yufu-wang.github.io/tram4d/
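The final composition step can be sketched as a per-frame rigid transform: body joints regressed in the camera frame are mapped into world space using the metric-scale camera poses recovered by SLAM. This is a minimal sketch of that idea, not TRAM's actual implementation; the function name and array shapes are assumptions for illustration.

```python
import numpy as np

def compose_world_motion(R_wc, t_wc, joints_cam):
    """Compose camera motion with camera-frame body motion.

    R_wc:       (T, 3, 3) camera-to-world rotations (e.g. from scaled SLAM)
    t_wc:       (T, 3)    camera-to-world translations, metric scale
    joints_cam: (T, J, 3) body joints predicted in camera coordinates
    Returns:    (T, J, 3) joints in world coordinates
    """
    # For each frame t and joint k: p_world = R_wc[t] @ p_cam[t, k] + t_wc[t]
    return np.einsum("tij,tkj->tki", R_wc, joints_cam) + t_wc[:, None, :]
```

With an identity camera rotation, this reduces to translating the body by the camera trajectory, which is why a correct metric scale on the SLAM translations is essential for global trajectory accuracy.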

Yufu Wang, Ziyun Wang, Lingjie Liu, Kostas Daniilidis • 2024

Related benchmarks

| Task | Dataset | Result | Rank |
| --- | --- | --- | --- |
| 3D Human Pose and Shape Estimation | EMDB Protocol 1 (24 joints) | PA-MPJPE 45.7 | 31 |
| Human Mesh Reconstruction | 3DPW (14 joints, test) | PA-MPJPE 35.6 | 26 |
| Human Mesh Reconstruction | EMDB (24 joints, test) | PA-MPJPE 45.7 | 21 |
| Human Mesh Recovery | 3DPW (14 joints, test) | PA-MPJPE 35.6 | 10 |
| Global human motion estimation | EMDB 2 | WA-MPJPE 76.4 | 8 |
| Global human motion estimation | RICH | WA-MPJPE 127.8 | 7 |
| Human global trajectory and motion reconstruction | EMDB 2 | PA-MPJPE 38.1 | 5 |
| Human motion estimation in world coordinates | EMDB 2 (24 joints, test) | WA-MPJPE 76.4 | 4 |
| Human Trajectory Reconstruction | SLOPER4D (test) | WA-MPJPE 149.5 | 4 |
| Scene Geometry Reconstruction | SLOPER4D (test) | Chamfer Distance 10.66 | 3 |

Other info

Code
