
PhysBrain: Human Egocentric Data as a Bridge from Vision Language Models to Physical Intelligence

About

Robotic generalization relies on physical intelligence: the ability to reason about state changes, contact-rich interactions, and long-horizon planning under egocentric perception and action. Vision Language Models (VLMs) are essential to Vision-Language-Action (VLA) systems, but their reliance on third-person training data creates a viewpoint gap for humanoid robots. Collecting massive robot-centric data would be ideal but is impractical due to cost and diversity constraints. Conversely, human egocentric videos offer a highly scalable data source with rich interaction context, yet the embodiment mismatch prevents direct application. To bridge this gap, we propose an Egocentric2Embodiment Translation Pipeline that transforms raw human egocentric videos into multi-level, schema-driven embodiment supervision with enforced evidence grounding and temporal consistency, enabling the construction of the Egocentric2Embodiment dataset (E2E-3M) at scale. An egocentric-aware embodied brain, termed PhysBrain, is obtained by training on the E2E-3M dataset. PhysBrain exhibits substantially improved egocentric understanding, particularly for planning. It provides an egocentric-aware initialization that enables more sample-efficient VLA fine-tuning and higher success rates, demonstrating effective transfer from human egocentric supervision to downstream robot control.
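The abstract describes translating raw egocentric video into schema-driven supervision records with two enforced properties: every record is grounded in an evidence span of the source video, and the spans must be temporally consistent. The sketch below illustrates that idea in minimal Python; the `Segment` schema, the two supervision levels, and the `translate` function are all hypothetical names for illustration, not the paper's actual pipeline.

```python
from dataclasses import dataclass

@dataclass
class Segment:
    start: float   # seconds into the video
    end: float
    caption: str   # what the hands are doing in this span

def translate(segments, task_goal):
    """Turn raw egocentric segments into schema-driven supervision records.

    Each record carries its evidence span so every claim is grounded in the
    source video, and spans must be temporally consistent (ordered and
    non-overlapping) or the translation is rejected.
    """
    # Enforce temporal consistency: sort by start time, reject overlaps.
    ordered = sorted(segments, key=lambda s: s.start)
    for prev, nxt in zip(ordered, ordered[1:]):
        if prev.end > nxt.start:
            raise ValueError(f"overlapping evidence spans: {prev} vs {nxt}")

    # Multi-level supervision: step-level instructions plus a task-level
    # plan summary, each grounded in explicit evidence timestamps.
    steps = [
        {"level": "step",
         "instruction": s.caption,
         "evidence": (s.start, s.end)}
        for s in ordered
    ]
    plan = {"level": "task",
            "goal": task_goal,
            "num_steps": len(steps),
            "evidence": (ordered[0].start, ordered[-1].end)}
    return {"plan": plan, "steps": steps}
```

A real pipeline would derive the captions and spans from a VLM annotator rather than hand-written segments, but the grounding and consistency checks shown here are the part the abstract emphasizes.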

Xiaopeng Lin, Shijie Lian, Bin Yu, Ruoqi Yang, Zhaolong Shen, Changti Wu, Yuzhuo Miao, Yurun Jin, Yukun Shi, Jiyan He, Cong Huang, Bojun Cheng, Kai Chen • 2025

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
| --- | --- | --- | --- | --- |
| Task Planning | EgoPlan-Bench (val) | Accuracy | 47.4 | 11 |
| Task Planning | EgoPlan-Bench2 | Accuracy | 46.9 | 11 |
| Egocentric Understanding | EgoThink | Action Accuracy | 69 | 11 |
| Robotic Manipulation | RoboCasa GR1 Tabletop Manipulation (test) | PnP Bottle To Cabinet Close | 74 | 6 |
