
Interaction-aware Representation Modeling with Co-occurrence Consistency for Egocentric Hand-Object Parsing

About

A fine-grained understanding of egocentric human-environment interactions is crucial for developing next-generation embodied agents. One fundamental challenge in this area involves accurately parsing hands and active objects. While transformer-based architectures have demonstrated considerable potential for such tasks, several key limitations remain unaddressed: 1) existing query initialization mechanisms rely primarily on semantic cues or learnable parameters, demonstrating limited adaptability to changing active objects across varying input scenes; 2) previous transformer-based methods utilize pixel-level semantic features to iteratively refine queries during mask generation, which may introduce interaction-irrelevant content into the final embeddings; and 3) prevailing models are susceptible to "interaction illusion", producing physically inconsistent predictions. To address these issues, we propose an end-to-end Interaction-aware Transformer (InterFormer), which integrates three key components, i.e., a Dynamic Query Generator (DQG), a Dual-context Feature Selector (DFS), and the Conditional Co-occurrence (CoCo) loss. The DQG explicitly grounds query initialization in the spatial dynamics of hand-object contact, enabling targeted generation of interaction-aware queries for hands and various active objects. The DFS fuses coarse interactive cues with semantic features, thereby suppressing interaction-irrelevant noise and emphasizing the learning of interactive relationships. The CoCo loss incorporates hand-object relationship constraints to enhance physical consistency in prediction. Our model achieves state-of-the-art performance on both the EgoHOS and the challenging out-of-distribution mini-HOI4D datasets, demonstrating its effectiveness and strong generalization ability. Code and models are publicly available at https://github.com/yuggiehk/InterFormer.
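The abstract's third point concerns an "interaction illusion": predicting an active object with no hand plausibly interacting with it. The paper's actual CoCo loss is not specified here, but the general idea of a conditional co-occurrence constraint can be illustrated with a toy sketch. Everything below (the function name, the max-pooled "presence" scores, the one-sided penalty) is a hypothetical simplification for intuition only, not the paper's formulation.

```python
import numpy as np

def cooccurrence_penalty(hand_prob: np.ndarray, obj_prob: np.ndarray) -> float:
    """Toy co-occurrence penalty over per-pixel probabilities in [0, 1].

    Penalizes active-object evidence that is unsupported by any hand
    evidence: an active object implies a hand, but a hand does not
    imply an active object, so the constraint is one-sided.
    """
    hand_presence = hand_prob.max()   # soft indicator that a hand is present
    obj_presence = obj_prob.max()     # soft indicator that an active object is present
    # Only object-without-hand is physically inconsistent.
    return float(max(0.0, obj_presence - hand_presence))

# An "interaction illusion": confident object mask, no hand anywhere.
hand = np.zeros((4, 4))
obj = np.full((4, 4), 0.9)
print(cooccurrence_penalty(hand, obj))  # 0.9
```

In a real training loop such a term would be added to the segmentation loss with a weighting coefficient, so gradients push the network away from physically inconsistent mask pairs.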

Yuejiao Su, Yi Wang, Lei Yao, Yawen Cui, Lap-Pui Chau · 2026

Related benchmarks

Task | Dataset | Result | Rank
Semantic segmentation | EgoHOS in-domain (test) | Left Hand IoU: 92.51 | 13
Egocentric Hand-Object Segmentation | EgoHOS out-of-domain (test) | Left Hand IoU: 94.38 | 11
Egocentric Hand-Object Segmentation | mini-HOI4D out-of-distribution (test) | IoU (Left Hand): 66.44 | 11
Hand-object segmentation | EgoHOS out-of-domain (test) | Left Hand Accuracy: 0.9721 | 10
Hand-object segmentation | HOI4D mini | Left Hand Accuracy: 96.55 | 10
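The benchmark results above are reported as IoU (intersection-over-union) per mask class. For reference, a minimal sketch of the standard IoU metric over binary masks (the benchmarks' exact evaluation code may differ in edge-case handling):

```python
import numpy as np

def iou(pred: np.ndarray, gt: np.ndarray) -> float:
    """Intersection-over-Union between two binary masks.

    Returns 1.0 when both masks are empty (a common convention;
    benchmark implementations may instead skip such images).
    """
    inter = np.logical_and(pred, gt).sum()
    union = np.logical_or(pred, gt).sum()
    return float(inter) / union if union > 0 else 1.0

pred = np.array([[1, 1], [0, 0]], dtype=bool)
gt = np.array([[1, 0], [0, 0]], dtype=bool)
print(iou(pred, gt))  # 0.5
```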
