
Neural Inverse Rendering from Propagating Light

About

We present the first system for physically based, neural inverse rendering from multi-viewpoint videos of propagating light. Our approach relies on a time-resolved extension of neural radiance caching -- a technique that accelerates inverse rendering by storing infinite-bounce radiance arriving at any point from any direction. The resulting model accurately accounts for direct and indirect light transport effects and, when applied to captured measurements from a flash lidar system, enables state-of-the-art 3D reconstruction in the presence of strong indirect light. Further, we demonstrate view synthesis of propagating light, automatic decomposition of captured measurements into direct and indirect components, as well as novel capabilities such as multi-view time-resolved relighting of captured scenes.
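The central component above is a radiance cache queried by position, view direction, and time. Below is a minimal sketch of how such a time-resolved cache could be parameterized as a small MLP; the class name, layer sizes, and time normalization are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (assumption): a time-resolved radiance cache as a small MLP that
# maps a 3D point, view direction, and time bin to cached multi-bounce radiance.
import torch
import torch.nn as nn

class TimeResolvedRadianceCache(nn.Module):
    def __init__(self, hidden_dim: int = 128, n_time_bins: int = 64):
        super().__init__()
        self.n_time_bins = n_time_bins
        # Inputs: 3D position (3), unit view direction (3), normalized time (1)
        self.mlp = nn.Sequential(
            nn.Linear(3 + 3 + 1, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, 3),  # RGB radiance for this (x, d, t) query
            nn.Softplus(),             # radiance is non-negative
        )

    def forward(self, x: torch.Tensor, d: torch.Tensor, t: torch.Tensor) -> torch.Tensor:
        """Query cached infinite-bounce radiance arriving at point x from direction d at time bin t."""
        t_norm = t.float().unsqueeze(-1) / self.n_time_bins
        return self.mlp(torch.cat([x, d, t_norm], dim=-1))

# Example query: one point, one direction, one time bin.
cache = TimeResolvedRadianceCache()
x = torch.zeros(1, 3)                # query point
d = torch.tensor([[0.0, 0.0, 1.0]])  # view direction
t = torch.tensor([10])               # time bin index
radiance = cache(x, d, t)            # shape (1, 3)
```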

Anagh Malik, Benjamin Attal, Andrew Xie, Matthew O'Toole, David B. Lindell • 2025

Related benchmarks

Task                                                      | Dataset                          | Result          | Rank
View synthesis of integrated (steady-state) lidar images | Captured real                    | PSNR (dB) 30.99 | 7
Geometry Recovery                                         | Captured multi-viewpoint dataset | MAE 8.45        | 4
Geometry Reconstruction                                   | Simulated                        | MAE 8.45        | 3
View synthesis of integrated (steady-state) lidar images | Simulated                        | PSNR (dB) 30.99 | 3
View synthesis of time-resolved lidar measurements       | Simulated                        | T-IoU 76        | 3
View synthesis of time-resolved lidar measurements       | Captured real                    | T-IoU 54        | 3

Other info

Code
