
Open-Vocabulary Affordance Detection in 3D Point Clouds

About

Affordance detection is a challenging problem with a wide variety of robotic applications. Traditional affordance detection methods are limited to a predefined set of affordance labels, potentially restricting the adaptability of intelligent robots in complex and dynamic environments. In this paper, we present the Open-Vocabulary Affordance Detection (OpenAD) method, which is capable of detecting an unbounded number of affordances in 3D point clouds. By simultaneously learning the affordance text and the point feature, OpenAD successfully exploits the semantic relationships between affordances. Our proposed method therefore enables zero-shot detection and can detect previously unseen affordances without a single annotation example. Intensive experimental results show that OpenAD works effectively on a wide range of affordance detection setups and outperforms other baselines by a large margin. Additionally, we demonstrate the practicality of the proposed OpenAD in real-world robotic applications with a fast inference speed (~100ms). Our project is available at https://openad2023.github.io.
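The core idea above (jointly embedding affordance label text and per-point features, so that any label string can be scored at test time) can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes the general open-vocabulary recipe of comparing L2-normalized per-point features against text embeddings via cosine similarity; the random arrays stand in for real backbone and text-encoder outputs.

```python
import numpy as np

def normalize(x, axis=-1, eps=1e-8):
    """L2-normalize along `axis` so dot products become cosine similarities."""
    return x / (np.linalg.norm(x, axis=axis, keepdims=True) + eps)

def affordance_logits(point_feats, text_embeds, temperature=0.07):
    """Cosine similarity between every point and every affordance label.

    point_feats: (N, D) per-point features from a 3D point-cloud backbone.
    text_embeds: (K, D) embeddings of K affordance label names; because K is
                 open-ended, unseen labels can simply be appended at test time.
    Returns an (N, K) logit matrix.
    """
    p = normalize(point_feats)
    t = normalize(text_embeds)
    return (p @ t.T) / temperature

def detect(point_feats, text_embeds):
    """Per-point affordance prediction: softmax over labels, then argmax."""
    logits = affordance_logits(point_feats, text_embeds)
    logits -= logits.max(axis=1, keepdims=True)   # numerical stability
    probs = np.exp(logits)
    probs /= probs.sum(axis=1, keepdims=True)
    return probs.argmax(axis=1), probs

# Placeholder data: 2048 points, 512-D features, 4 hypothetical labels
# (e.g. "grasp", "cut", "pour", "none").
rng = np.random.default_rng(0)
points = rng.normal(size=(2048, 512))
labels = rng.normal(size=(4, 512))
pred, probs = detect(points, labels)
print(pred.shape, probs.shape)   # (2048,) (2048, 4)
```

Because the label set enters only through `text_embeds`, swapping in the embedding of a new, never-annotated affordance word is enough to query it, which is what makes the zero-shot setting possible.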

Toan Nguyen, Minh Nhat Vu, An Vuong, Dzung Nguyen, Thieu Vo, Ngan Le, Anh Nguyen• 2023

Related benchmarks

Task | Dataset | Metric | Result | Rank
3D Affordance Grounding | Affogato (test) | aIoU | 3.1 | 14
3D Affordance Grounding | PIAD Unseen Object v2 | aIoU | 16.62 | 7
3D Affordance Grounding | PIAD Unseen Affordance v2 | aIoU | 8 | 7
3D Affordance Grounding | PIAD v2 (Seen) | mIoU | 31.88 | 7
3D Object Affordance Grounding | AGPIL Full-view (Seen) | AUC | 85.84 | 5
3D Object Affordance Grounding | AGPIL Partial-view (Seen) | AUC | 81.45 | 5
3D Object Affordance Grounding | AGPIL Rotation-view (Seen) | AUC | 73.25 | 5
3D Object Affordance Grounding | AGPIL Full-view (Unseen) | AUC | 73.75 | 5
3D Object Affordance Grounding | AGPIL Partial-view (Unseen) | AUC | 70.14 | 5
3D Object Affordance Grounding | AGPIL Rotation-view (Unseen) | AUC | 59.89 | 5
