Urban Radiance Fields
About
The goal of this work is to perform 3D reconstruction and novel view synthesis from data captured by scanning platforms commonly deployed for world mapping in urban outdoor environments (e.g., Street View). Given a sequence of posed RGB images and lidar sweeps acquired by cameras and scanners moving through an outdoor scene, we produce a model from which 3D surfaces can be extracted and novel RGB images can be synthesized. Our approach extends Neural Radiance Fields, which has been demonstrated to synthesize realistic novel images for small scenes in controlled settings, with new methods for leveraging asynchronously captured lidar data, for addressing exposure variation between captured images, and for leveraging predicted image segmentations to supervise densities on rays pointing at the sky. Each of these three extensions provides significant performance improvements in experiments on Street View data. Our system produces state-of-the-art 3D surface reconstructions and synthesizes higher quality novel views in comparison to both traditional methods (e.g., COLMAP) and recent neural representations (e.g., Mip-NeRF).
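To illustrate the third extension, here is a minimal sketch of a sky supervision term: rays that a predicted segmentation labels as sky should accumulate (nearly) zero opacity, since no surface lies along them. This is an assumption-laden simplification, not the paper's exact loss; the function name and penalty form (mean squared accumulated opacity on sky rays) are hypothetical.

```python
import numpy as np

def sky_opacity_loss(ray_opacity, sky_mask):
    """Hypothetical sky supervision sketch: penalize accumulated opacity
    on rays whose predicted segmentation labels them as sky, which pushes
    the radiance field's densities toward zero along those rays.

    ray_opacity: (N,) accumulated alpha per ray, each value in [0, 1]
    sky_mask:    (N,) boolean, True where the segmentation predicts sky
    """
    sky = ray_opacity[sky_mask]
    if sky.size == 0:
        # No sky rays in this batch: the term contributes nothing.
        return 0.0
    # Squared penalty: any non-zero opacity on a sky ray is discouraged.
    return float(np.mean(sky ** 2))
```

In training, a term like this would be weighted and added to the photometric loss, so non-sky rays are unaffected while sky rays are driven toward full transparency.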
Related benchmarks
| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Scene Reconstruction | nuScenes | PSNR | 20.75 | 17 |
| Depth Estimation | Captured Lidar Scenes hardware prototype (depth evaluation) | L1 Error (Depth) | 0.004 | 15 |
| Novel View Synthesis | Simulated dataset | PSNR (dB) | 22.34 | 15 |
| Depth Estimation | Simulated scenes | L1 Error (Depth) | 0.029 | 15 |
| Novel View Synthesis | Captured Lidar Scenes novel views hardware prototype | PSNR (dB) | 19.11 | 15 |
| Surrounding View Synthesis | NuScenes v1.0 (test) | PSNR | 20.75 | 11 |
| Novel View Synthesis | Street View dataset Setting 1 (Held-out Viewpoints) | PSNR | 20.421 | 6 |
| LiDAR Novel View Synthesis | Waymo interp. | MAE | 28.2 | 6 |
| LiDAR Novel View Synthesis | TownClean | MAE | 43.3 | 6 |
| LiDAR Novel View Synthesis | TownReal | MAE | 52.1 | 6 |