
Super Odometry: IMU-centric LiDAR-Visual-Inertial Estimator for Challenging Environments

About

We propose Super Odometry, a high-precision multi-modal sensor fusion framework that provides a simple but effective way to fuse multiple sensors such as LiDAR, camera, and IMU, achieving robust state estimation in perceptually-degraded environments. Unlike traditional sensor-fusion methods, Super Odometry employs an IMU-centric data processing pipeline, which combines the advantages of loosely coupled and tightly coupled methods and recovers motion in a coarse-to-fine manner. The proposed framework is composed of three parts: IMU odometry, visual-inertial odometry, and laser-inertial odometry. The visual-inertial odometry and laser-inertial odometry provide pose priors to constrain the IMU bias and receive motion predictions from the IMU odometry. To ensure high performance in real time, we apply a dynamic octree that consumes only 10% of the running time of a static KD-tree. The proposed system was deployed on drones and ground robots as part of Team Explorer's effort in the DARPA Subterranean Challenge, where the team won 1st and 2nd place in the Tunnel and Urban Circuits, respectively.
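The coarse-to-fine loop described above can be sketched in a few lines: the IMU odometry integrates raw measurements for a coarse motion prediction, while pose priors from the visual-inertial or laser-inertial modules correct the state and constrain the IMU bias. The class and update rule below are purely illustrative assumptions, not the authors' implementation.

```python
import numpy as np

class IMUCentricOdometry:
    """Illustrative sketch of an IMU-centric pipeline (not the paper's code)."""

    def __init__(self):
        self.position = np.zeros(3)
        self.velocity = np.zeros(3)
        self.accel_bias = np.zeros(3)  # constrained by VIO/LIO pose priors

    def predict(self, accel: np.ndarray, dt: float) -> np.ndarray:
        """Coarse step: integrate bias-corrected acceleration (gravity omitted
        for brevity) to propagate velocity and position."""
        self.velocity += (accel - self.accel_bias) * dt
        self.position += self.velocity * dt
        return self.position.copy()

    def correct(self, pose_prior: np.ndarray, gain: float = 0.1) -> None:
        """Fine step: a pose prior from VIO or LIO snaps the state back and
        nudges the bias estimate (a crude proxy for the joint optimization
        the paper performs)."""
        error = self.position - pose_prior
        self.accel_bias += gain * error  # hypothetical bias-update rule
        self.position = pose_prior.copy()
```

In the actual system the correction is a factor-graph optimization rather than the simple feedback gain shown here; the sketch only conveys the data flow, with predictions flowing out of the IMU module and pose priors flowing back in.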

Shibo Zhao, Hengrui Zhang, Peng Wang, Lucas Nogueira, Sebastian Scherer · 2021

Related benchmarks

| Task | Dataset | Result | Rank |
| --- | --- | --- | --- |
| Cross-view geo-localization | TartanDrive 2.0, seen routes TD01–06 | Median Distance Error (TD01): 5.67 | 4 |
| Cross-view geo-localization | TartanDrive 2.0, unseen routes TD07–22 | Route TD07 result: 8.76 | 4 |
