
Transient Neural Radiance Fields for Lidar View Synthesis and 3D Reconstruction

About

Neural radiance fields (NeRFs) have become a ubiquitous tool for modeling scene appearance and geometry from multiview imagery. Recent work has also begun to explore how to use additional supervision from lidar or depth sensor measurements in the NeRF framework. However, previous lidar-supervised NeRFs focus on rendering conventional camera imagery and use lidar-derived point cloud data only as auxiliary supervision; thus, they fail to incorporate the underlying image formation model of the lidar. Here, we propose transient NeRFs, a novel method that takes as input the raw, time-resolved photon count histograms measured by a single-photon lidar system, and we seek to render such histograms from novel views. Different from conventional NeRFs, the approach relies on a time-resolved version of the volume rendering equation to render the lidar measurements and capture transient light transport phenomena at picosecond timescales. We evaluate our method on a first-of-its-kind dataset of simulated and captured transient multiview scans from a prototype single-photon lidar. Overall, our work brings NeRFs to a new dimension of imaging at transient timescales, newly enabling rendering of transient imagery from novel views. Additionally, we show that our approach recovers improved geometry and conventional appearance compared to point cloud-based supervision when training on few input viewpoints. Transient NeRFs may be especially useful for applications that seek to simulate raw lidar measurements for downstream tasks in autonomous driving, robotics, and remote sensing.
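The core idea of time-resolved volume rendering can be sketched as follows: instead of compositing per-sample contributions into a single pixel value, each sample's standard NeRF weight is accumulated into a temporal bin determined by its round-trip time of flight. Below is a minimal NumPy sketch of this idea for a single ray; the function name, parameters, and simple hard-binning scheme are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

C = 3e8  # speed of light (m/s)

def render_transient(dists, sigmas, radiances, bin_size, n_bins):
    """Sketch of time-resolved volume rendering along one ray.

    dists:     sample distances from the sensor (m), shape (N,), increasing
    sigmas:    volume densities at the samples, shape (N,)
    radiances: per-sample radiance/reflectance values, shape (N,)
    bin_size:  temporal bin width of the histogram (s)
    n_bins:    number of time bins
    Returns a photon-count-like transient histogram of shape (n_bins,).
    """
    # spacing between consecutive samples (last delta repeats the previous one)
    deltas = np.diff(dists, append=dists[-1] + (dists[-1] - dists[-2]))
    alphas = 1.0 - np.exp(-sigmas * deltas)                  # per-sample opacity
    trans = np.cumprod(np.concatenate([[1.0], 1.0 - alphas[:-1]]))  # transmittance
    weights = trans * alphas                                 # standard NeRF weights

    # round-trip time of flight maps each sample to a histogram bin
    tof = 2.0 * dists / C
    bins = np.clip((tof / bin_size).astype(int), 0, n_bins - 1)
    hist = np.zeros(n_bins)
    np.add.at(hist, bins, weights * radiances)               # accumulate per bin
    return hist
```

Summing the histogram over time recovers the conventional (steady-state) rendered intensity, which is why this formulation strictly generalizes the standard volume rendering equation.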

Anagh Malik, Parsa Mirdehghan, Sotiris Nousias, Kiriakos N. Kutulakos, David B. Lindell • 2023

Related benchmarks

Task | Dataset | Metric | Result | Rank
Depth Estimation | Simulated scenes | L1 Error (Depth) | 0.011 | 15
Novel View Synthesis | Captured Lidar Scenes, novel views (hardware prototype) | PSNR (dB) | 22.72 | 15
Novel View Synthesis | Simulated dataset | PSNR (dB) | 28.39 | 15
Depth Estimation | Captured Lidar Scenes, hardware prototype (depth evaluation) | L1 Error (Depth) | 0.005 | 15
View synthesis of integrated (steady-state) lidar images | Captured (real) | PSNR (dB) | 24.52 | 7
Depth Estimation | Captured scenes | L1 Error (2 views) | 0.006 | 5
Geometry Recovery | Captured multi-viewpoint dataset | MAE | 22.54 | 4
Geometry Reconstruction | Simulated | MAE | 28 | 3
View synthesis of integrated (steady-state) lidar images | Simulated | PSNR (dB) | 22.44 | 3
View synthesis of time-resolved lidar measurements | Simulated | T-IOU | 58 | 3

Showing 10 of 11 rows

Other info

Code
