
Echoes Beyond Points: Unleashing the Power of Raw Radar Data in Multi-modality Fusion

About

Radar is ubiquitous in autonomous driving systems due to its low cost and robustness in bad weather. Nevertheless, radar detection performance is usually inferior because its point cloud is sparse and inaccurate, owing to poor azimuth and elevation resolution. Moreover, point-cloud generation algorithms drop weak signals to reduce false targets, which may be suboptimal for deep fusion. In this paper, we propose a novel method named EchoFusion that skips the existing radar signal-processing pipeline and instead incorporates the raw radar data with other sensors. Specifically, we first generate Bird's Eye View (BEV) queries and then take the corresponding spectrum features from the radar to fuse with other sensors. With this approach, our method can exploit both the rich, lossless distance and speed cues from radar echoes and the rich semantic cues from images, surpassing all existing methods on the RADIal dataset and approaching the performance of LiDAR. The code will be released at https://github.com/tusen-ai/EchoFusion.
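The core idea in the abstract — generate BEV queries, then gather the corresponding radar spectrum features and image features for fusion — can be sketched in a few lines. This is a minimal numpy illustration, not the paper's implementation: the function name, the polar-grid binning, and the crude column-pooled image lookup are all assumptions made for the example.

```python
import numpy as np

def sample_bev_features(radar_ra, img_feat, query_xy, r_max=50.0, fov=np.pi / 2):
    """For each BEV query (x, y) in metres, gather one radar
    range-azimuth spectrum cell and a coarse image-column feature,
    then concatenate them (hypothetical fusion sketch)."""
    n_r, n_a, c_r = radar_ra.shape   # range bins x azimuth bins x channels
    h, w, c_i = img_feat.shape       # image-feature height x width x channels
    fused = []
    for x, y in query_xy:
        rng = np.hypot(x, y)                       # range of the query point
        azi = np.arctan2(y, x)                     # azimuth of the query point
        ri = min(int(rng / r_max * n_r), n_r - 1)  # nearest range bin
        ai = min(int((azi + fov / 2) / fov * n_a), n_a - 1)  # nearest azimuth bin
        ci = min(int((azi + fov / 2) / fov * w), w - 1)      # matching image column
        img_col = img_feat[:, ci].mean(axis=0)     # pool the column over image rows
        fused.append(np.concatenate([radar_ra[ri, ai], img_col]))
    return np.stack(fused)                         # (n_query, c_r + c_i)

# Toy example with random features.
radar = np.random.rand(64, 32, 8)    # range x azimuth x radar channels
image = np.random.rand(16, 32, 16)   # H x W x image channels
queries = np.array([[10.0, 0.0], [20.0, 5.0]])
out = sample_bev_features(radar, image, queries)
print(out.shape)  # (2, 24)
```

In the actual method the fused features would feed a detection head; here the concatenation merely shows how one BEV query indexes both modalities at once.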

Yang Liu, Feng Wang, Naiyan Wang, Zhaoxiang Zhang • 2023

Related benchmarks

Task | Dataset | Result | Rank
3D Object Detection | K-Radar | AP3D Total: 47.4 | 12
3D Object Detection | K-Radar (test) | Detection Score (Total): 47.4 | 11
Object Detection | RADIal original protocol (test) | AP: 96.95 | 5
3D Object Detection | KRadar (test) | AP@0.3 (3D): 68.35 | 3
BEV Object Detection | KRadar (test) | AP (BEV) @ IoU=0.3: 69.95 | 3
3D Object Detection | RADIal refined 3D ground-truth (test) | 3D AP@0.5 Overall: 39.81 | 3
BEV Object Detection | RADIal refined 3D ground-truth (test) | BEV AP@0.7 (Overall): 84.92 | 2

Other info

Code
