End-to-end Interpretable Neural Motion Planner

About

In this paper, we propose a neural motion planner (NMP) for learning to drive autonomously in complex urban scenarios that include traffic-light handling, yielding, and interactions with multiple road users. Towards this goal, we design a holistic model that takes raw LIDAR data and an HD map as input and produces interpretable intermediate representations in the form of 3D detections and their future trajectories, as well as a cost volume defining the goodness of each position the self-driving car can take within the planning horizon. We then sample a set of diverse, physically feasible trajectories and choose the one with the minimum learned cost. Importantly, our cost volume naturally captures multi-modality. We demonstrate the effectiveness of our approach on real-world driving data captured in several North American cities. Our experiments show that the learned cost volume yields safer plans than all baselines.
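The planning step described in the abstract, scoring a set of sampled trajectories against a learned cost volume and picking the cheapest, can be sketched as below. This is a minimal illustration, not the authors' implementation: the `plan` function, the `resolution` parameter, and the array shapes are assumptions for the sketch, and trajectory sampling and feasibility checks are assumed to happen elsewhere.

```python
import numpy as np

def plan(cost_volume, trajectories, resolution=0.5):
    """Pick the minimum-cost trajectory from a set of candidates.

    cost_volume: (T, H, W) array, learned cost of each bird's-eye-view
                 grid cell at each future timestep (illustrative shape).
    trajectories: (N, T, 2) array of candidate (x, y) waypoints in
                  metres, assumed already physically feasible.
    resolution: metres per grid cell (illustrative value).
    """
    T, H, W = cost_volume.shape
    # Convert metric waypoints to grid indices, clipped to the volume.
    cols = np.clip((trajectories[..., 0] / resolution).astype(int), 0, W - 1)
    rows = np.clip((trajectories[..., 1] / resolution).astype(int), 0, H - 1)
    # A trajectory's cost is the sum of cell costs along its waypoints,
    # reading timestep t of the volume at waypoint t of the trajectory.
    t_idx = np.arange(T)
    costs = cost_volume[t_idx, rows, cols].sum(axis=1)  # shape (N,)
    return int(np.argmin(costs)), costs
```

Because the cost volume scores every position independently, this argmin selection handles multi-modal futures without committing to a single predicted mode.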

Wenyuan Zeng, Wenjie Luo, Simon Suo, Abbas Sadat, Bin Yang, Sergio Casas, Raquel Urtasun • 2021

Related benchmarks

Task                  | Dataset                                              | Result                     | Rank
----------------------|------------------------------------------------------|----------------------------|-----
Open-loop planning    | nuScenes (val)                                       | L2 Error (3s): 2.05        | 151
Planning              | nuScenes v1.0-trainval (val)                         | --                         | 39
Trajectory Planning   | nuScenes 1.0 (test)                                  | --                         | 14
Motion Planning       | nuScenes v1.0 (val)                                  | L2 Error (3s): 2.05        | 9
Motion Planning       | UrbanScenarios Original v1.0 (val)                   | Collision Rate (3s): 2.6   | 4
Motion Planning       | UrbanScenarios AdvSim generated scenarios v1.0 (val) | Collision Rate (3s): 14.2  | 4
Trajectory Prediction | UrbanScenarios Original                              | L2 Center Error @3s: 1.43  | 2
Trajectory Prediction | UrbanScenarios AdvSim generated scenarios            | L2 Center Error @3s: 1.63  | 2
Object Perception     | UrbanScenarios Original                              | AP (IoU=0.7): 81.7         | 2
Object Perception     | UrbanScenarios AdvSim generated scenarios            | AP (IoU=0.7): 72.7         | 2