# PanoAffordanceNet: Towards Holistic Affordance Grounding in 360° Indoor Environments

## About
Global perception is essential for embodied agents in 360° spaces, yet current affordance grounding remains largely object-centric and restricted to perspective views. To bridge this gap, we introduce a novel task: Holistic Affordance Grounding in 360° Indoor Environments. This task faces unique challenges, including severe geometric distortions from Equirectangular Projection (ERP), semantic dispersion, and cross-scale alignment difficulties. We propose PanoAffordanceNet, an end-to-end framework featuring a Distortion-Aware Spectral Modulator (DASM) for latitude-dependent calibration and an Omni-Spherical Densification Head (OSDH) to restore topological continuity from sparse activations. By integrating multi-level constraints comprising pixel-wise, distributional, and region-text contrastive objectives, our framework effectively suppresses semantic drift under low supervision. Furthermore, we construct 360-AGD, the first high-quality panoramic affordance grounding dataset. Extensive experiments demonstrate that PanoAffordanceNet significantly outperforms existing methods, establishing a solid baseline for scene-level perception in embodied intelligence. The source code and benchmark dataset will be made publicly available at https://github.com/GL-ZHU925/PanoAffordanceNet.
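The ERP distortion that DASM calibrates for is latitude-dependent: equirectangular images stretch content near the poles, so each pixel row covers progressively less spherical area. As a rough geometric illustration of the effect being compensated (this is not the paper's DASM, which operates as a spectral modulator), a cos(latitude) row weighting looks like:

```python
import numpy as np

def erp_latitude_weights(height: int) -> np.ndarray:
    """Per-row spherical-area weights for an equirectangular (ERP) image.

    Rows near the poles cover less solid angle than equatorial rows;
    weighting each row by cos(latitude) compensates for the horizontal
    stretch that ERP introduces at high latitudes.
    """
    # Map row centers to latitudes in (-pi/2, pi/2).
    lat = (np.arange(height) + 0.5) / height * np.pi - np.pi / 2
    return np.cos(lat)

weights = erp_latitude_weights(512)
# Equatorial rows get weight close to 1.0; polar rows approach 0.
```

Such area weights are also commonly used when averaging per-pixel losses or metrics over an ERP frame, so that the over-represented polar regions do not dominate.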
## Related benchmarks
| Task | Dataset | KLD ↓ | Rank |
|---|---|---|---|
| Affordance Grounding | AGD20K v1 (Seen) | 0.739 | 14 |
| Affordance Grounding | AGD20K v1 (Unseen) | 1.185 | 14 |
| Affordance Grounding | 360-AGD (Easy Split) | 1.27 | 3 |
| Affordance Grounding | 360-AGD (Hard Split) | 1.306 | 3 |
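KLD here is the Kullback–Leibler divergence between the predicted and ground-truth affordance heatmaps, each normalized to a probability distribution over pixels (lower is better). A minimal sketch of the saliency-style formulation commonly used in AGD20K-style evaluation (an assumption; the benchmark's exact epsilon and normalization may differ):

```python
import numpy as np

def kld(pred: np.ndarray, gt: np.ndarray, eps: float = 1e-12) -> float:
    """KL divergence KL(gt || pred) between two non-negative heatmaps.

    Both maps are normalized to sum to 1 so they can be compared as
    distributions over pixels; eps guards against log(0) and division
    by zero. Identical maps give a score near 0.
    """
    p = gt / (gt.sum() + eps)    # ground-truth distribution
    q = pred / (pred.sum() + eps)  # predicted distribution
    return float(np.sum(p * np.log(p / (q + eps) + eps)))
```

Because KL(gt || pred) heavily penalizes predicted maps that assign near-zero mass where the ground truth is active, it rewards coverage of all annotated affordance regions rather than a single confident peak.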