
Point-SLAM: Dense Neural Point Cloud-based SLAM

About

We propose a dense neural simultaneous localization and mapping (SLAM) approach for monocular RGBD input which anchors the features of a neural scene representation in a point cloud that is iteratively generated in an input-dependent, data-driven manner. We demonstrate that both tracking and mapping can be performed with the same point-based neural scene representation by minimizing an RGBD-based re-rendering loss. In contrast to recent dense neural SLAM methods, which anchor the scene features in a sparse grid, our point-based approach allows the anchor point density to adapt dynamically to the information density of the input. This strategy reduces runtime and memory usage in regions with fewer details and dedicates a higher point density to resolving fine details. Our approach performs better than, or competitively with, existing dense neural RGBD SLAM methods in tracking, mapping and rendering accuracy on the Replica, TUM-RGBD and ScanNet datasets. The source code is available at https://github.com/eriksandstroem/Point-SLAM.
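The key idea of adapting anchor-point density to the information density of the input can be illustrated with a toy sketch. This is not the authors' implementation: the 1-D scanline, the gradient-based density proxy, and the `anchor_spacing` / `place_anchors` helpers are all hypothetical simplifications, chosen only to show how high-detail regions receive more anchor points than flat ones.

```python
# Hypothetical sketch (not the Point-SLAM code): space anchor points
# along a 1-D scanline according to local "information density",
# approximated here by the local color-gradient magnitude.

def anchor_spacing(grad_mag, r_min=0.02, r_max=0.10):
    """Map a gradient magnitude in [0, 1] to a point spacing:
    high gradient (fine detail) -> small spacing (dense points)."""
    g = max(0.0, min(1.0, grad_mag))
    return r_max - g * (r_max - r_min)

def place_anchors(scanline):
    """Walk along a 1-D signal in [0, 1), stepping by a spacing
    derived from the local gradient magnitude."""
    anchors, x = [], 0.0
    n = len(scanline)
    while x < 1.0:
        i = min(n - 2, int(x * (n - 1)))
        grad = abs(scanline[i + 1] - scanline[i]) * (n - 1)
        anchors.append(x)
        x += anchor_spacing(grad)
    return anchors

# A flat region ends up with fewer anchors than a textured one.
flat = [0.5] * 50
textured = [0.5 + 0.3 * ((i % 2) * 2 - 1) for i in range(50)]
print(len(place_anchors(flat)), len(place_anchors(textured)))
```

In the real system the same principle is applied in 3D, with the point cloud grown iteratively from the input frames rather than from a fixed signal.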

Erik Sandström, Yue Li, Luc Van Gool, Martin R. Oswald • 2023

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
| --- | --- | --- | --- | --- |
| Camera pose estimation | ScanNet | – | – | 61 |
| Absolute Trajectory Estimation | TUM RGB-D | Desk Error | 0.043 | 23 |
| Visual SLAM | TUM RGB-D fr1 desk | – | – | 21 |
| Visual SLAM | TUM RGB-D fr2 xyz | – | – | 21 |
| Tracking | TUM RGBD (test) | fr1/desk Error | 2.73 | 18 |
| Camera Tracking | TUM RGB-D fr2 xyz | ATE RMSE | 0.0131 | 16 |
| Camera Tracking | TUM RGB-D fr3 office | ATE RMSE | 0.0348 | 16 |
| Camera Tracking | TUM RGB-D fr1 desk | ATE RMSE | 0.0434 | 16 |
| Camera Tracking | TUM RGB-D | Tracking Error (fr1/desk) | 2.73 | 16 |
| Camera Tracking | Replica | Rotation Error (rm-0) | 0.61 | 14 |
Showing 10 of 35 benchmark results.
