Language Conditioned Spatial Relation Reasoning for 3D Object Grounding
About
Localizing objects in 3D scenes from natural language requires understanding and reasoning about spatial relations. In particular, it is often crucial to distinguish similar objects referred to by the text, such as "the left-most chair" and "a chair next to the window". In this work we propose a language-conditioned transformer model for grounding 3D objects and their spatial relations. To this end, we design a spatial self-attention layer that accounts for relative distances and orientations between objects in input 3D point clouds. Training such a layer with visual and language inputs enables the model to disambiguate spatial relations and to localize the objects referred to by the text. To facilitate the cross-modal learning of relations, we further propose a teacher-student approach in which a teacher model is first trained using ground-truth object labels and then helps train a student model that takes point clouds as input. We perform ablation studies showing the advantages of our approach, and demonstrate that our model significantly outperforms the state of the art on the challenging Nr3D, Sr3D and ScanRefer 3D object grounding datasets.
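The core idea of the spatial self-attention layer can be sketched in a few lines: pairwise spatial features (distance and orientation) between object centers are turned into an attention bias whose weights are predicted from the sentence embedding, so the language decides which spatial relations matter. The following is a minimal NumPy sketch under assumed shapes, not the paper's implementation; all function and variable names here are illustrative.

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def pairwise_spatial_features(centers):
    """Pairwise relative features between object centers (N, 3):
    Euclidean distance plus sin/cos of the horizontal bearing angle."""
    diff = centers[:, None, :] - centers[None, :, :]       # (N, N, 3)
    dist = np.linalg.norm(diff, axis=-1, keepdims=True)    # (N, N, 1)
    theta = np.arctan2(diff[..., 1], diff[..., 0])         # (N, N)
    return np.concatenate(
        [dist, np.sin(theta)[..., None], np.cos(theta)[..., None]],
        axis=-1)                                           # (N, N, 3)

def language_conditioned_spatial_attention(obj_feats, centers, sent_emb, rng):
    """One language-conditioned spatial self-attention step (sketch).
    obj_feats: (N, D) object features, centers: (N, 3), sent_emb: (D,).
    Weights are randomly initialized here; in practice they are learned."""
    N, D = obj_feats.shape
    Wq, Wk, Wv = (rng.standard_normal((D, D)) / np.sqrt(D) for _ in range(3))
    Ws = rng.standard_normal((3, D)) / np.sqrt(D)  # maps language -> spatial weights
    q, k, v = obj_feats @ Wq, obj_feats @ Wk, obj_feats @ Wv
    content = (q @ k.T) / np.sqrt(D)               # (N, N) content scores
    spatial = pairwise_spatial_features(centers)   # (N, N, 3)
    w_lang = Ws @ sent_emb                         # (3,) language-dependent weights
    bias = spatial @ w_lang                        # (N, N) spatial attention bias
    attn = softmax(content + bias, axis=-1)
    return attn @ v, attn                          # updated features, attention map
```

Because the spatial bias is added inside the softmax, a sentence emphasizing "next to" can up-weight short distances while "left-most" can up-weight a particular bearing, which is the disambiguation behavior the abstract describes.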
Related benchmarks
| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| 3D Visual Grounding | ScanRefer (val) | Overall Accuracy @ IoU 0.5 | 37.73 | 155 |
| 3D Visual Grounding | Nr3D (test) | Overall Success Rate | 64.4 | 88 |
| 3D Visual Grounding | Nr3D | Overall Success Rate | 64.4 | 74 |
| 3D Visual Grounding | Sr3D (test) | Overall Accuracy | 72.8 | 73 |
| Visual Grounding | ScanRefer v1 (val) | -- | -- | 30 |
| 3D Visual Grounding | ScanRefer (test) | Unique Accuracy | 81.6 | 21 |
| 3D Object Grounding | ScanRefer detected proposals v1 (val) | Unique Acc@0.25 | 81.58 | 10 |
| 3D Visual Grounding | ScanRefer ScanNet v2 (val) | Unique Acc | 92 | 5 |
| 3D Visual Grounding | ARKitScenes (test) | Unique Success Rate | 57.2 | 5 |
| 3D Object Grounding | ScanRefer ground-truth object proposals | Overall Grounding Accuracy | 59.8 | 4 |