
Omnigrasp: Grasping Diverse Objects with Simulated Humanoids

About

We present a method for controlling a simulated humanoid to grasp an object and move it to follow a given object trajectory. Due to the challenges in controlling a humanoid with dexterous hands, prior methods often use a disembodied hand and only consider vertical lifts or short trajectories. This limited scope hampers their applicability for the object manipulation required in animation and simulation. To close this gap, we learn a controller that can pick up a large number (>1200) of objects and carry them to follow randomly generated trajectories. Our key insight is to leverage a humanoid motion representation that provides human-like motor skills and significantly speeds up training. Using only simple reward, state, and object representations, our method shows favorable scalability on diverse objects and trajectories. For training, we do not need a dataset of paired full-body motion and object trajectories. At test time, we only require the object mesh and the desired trajectories for grasping and transporting. To demonstrate the capabilities of our method, we show state-of-the-art success rates in following object trajectories and generalizing to unseen objects. Code and models will be released.
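The abstract mentions two ingredients that are easy to illustrate: a simple reward for following an object trajectory, and randomly generated target trajectories. The paper does not spell out its exact formulas here, so the snippet below is only a minimal sketch under common assumptions: an exponentiated-distance tracking reward and a random-walk trajectory sampler (the function names, `alpha` scale, and sampling scheme are all hypothetical, not the authors' implementation).

```python
import numpy as np

def trajectory_tracking_reward(obj_pos, target_pos, alpha=5.0):
    """Hypothetical tracking reward: exp(-alpha * distance) between the
    object's current position and the desired trajectory point at this
    timestep. Equals 1.0 at perfect tracking and decays with error."""
    dist = np.linalg.norm(np.asarray(obj_pos, dtype=float)
                          - np.asarray(target_pos, dtype=float))
    return float(np.exp(-alpha * dist))

def sample_random_trajectory(num_steps=120, step_scale=0.01, seed=0):
    """Hypothetical target-trajectory sampler: a smoothed 3D random walk
    of waypoints, one per control timestep."""
    rng = np.random.default_rng(seed)
    steps = rng.normal(scale=step_scale, size=(num_steps, 3))
    return np.cumsum(steps, axis=0)  # (num_steps, 3) waypoints
```

A dense per-step reward of this shape gives the policy a learning signal along the whole trajectory instead of only at the goal, which is one common reason methods of this kind avoid needing paired full-body motion and object-trajectory data.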

Zhengyi Luo, Jinkun Cao, Sammy Christen, Alexander Winkler, Kris Kitani, Weipeng Xu • 2024

Related benchmarks

Task | Dataset | Metric | Result | Rank
Object Grasping and Trajectory Following | GRAB Goal-Test 1.0 (Cross-Object) | Grasp Success Rate | 100 | 6
Object Grasping and Trajectory Following | GRAB IMOS 1.0 (Cross-Subject test) | Grasp Success Rate | 98.9 | 6
Grasping and Trajectory Following | OakInk 1330 objects (train) | Grasp Success Rate | 95.6 | 3
Grasping and Trajectory Following | OakInk 185 objects (test) | Grasp Success Rate | 94.3 | 3
Humanoid Object Grasping and Trajectory Following | OMOMO 7 objects | Time To Reach | 100 | 1

Other info

Code
