Ref-NeuS: Ambiguity-Reduced Neural Implicit Surface Learning for Multi-View Reconstruction with Reflection
About
Neural implicit surface learning has shown significant progress in multi-view 3D reconstruction, where an object is represented by multilayer perceptrons that provide a continuous implicit surface representation and view-dependent radiance. However, current methods often fail to accurately reconstruct reflective surfaces, leading to severe ambiguity. To overcome this issue, we propose Ref-NeuS, which aims to reduce ambiguity by attenuating the effect of reflective surfaces. Specifically, we utilize an anomaly detector to estimate an explicit reflection score, guided by multi-view context, to localize reflective surfaces. We then design a reflection-aware photometric loss that adaptively reduces ambiguity by modeling the rendered color as a Gaussian distribution, with the reflection score representing the variance. We show that, together with a reflection direction-dependent radiance, our model achieves high-quality reconstruction of reflective surfaces and outperforms state-of-the-art methods by a large margin. In addition, our model achieves comparable results on general surfaces.
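The reflection-aware photometric loss described above can be viewed as a Gaussian negative log-likelihood in which the reflection score plays the role of the standard deviation, so pixels flagged as reflective produce a smaller error term and weaker gradients. The exact formulation is given in the paper; the snippet below is only a minimal NumPy sketch under that reading, and the names `rendered`, `gt`, and `reflection_score` are illustrative:

```python
import numpy as np

def reflection_aware_loss(rendered, gt, reflection_score, eps=1e-6):
    """Per-pixel Gaussian negative log-likelihood (sketch, not the paper's exact form).

    A large reflection score shrinks the error term residual^2 / (2*sigma^2),
    so reflective pixels contribute less to the photometric loss; the log(sigma)
    term keeps the variance from growing without penalty.
    """
    sigma = reflection_score + eps          # variance proxy from the anomaly detector
    residual = rendered - gt                # photometric error per pixel
    nll = residual**2 / (2.0 * sigma**2) + np.log(sigma)
    return nll.mean()

rendered = np.array([0.9, 0.4])
gt = np.array([0.5, 0.5])
score = np.array([5.0, 0.2])                # first pixel flagged as reflective
loss = reflection_aware_loss(rendered, gt, score)
```

With equal photometric errors, the reflective pixel (high score) is down-weighted in the squared-error term relative to the non-reflective one, which is how the model avoids carving spurious geometry into specular regions.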
Related benchmarks
| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Surface Reconstruction | DTU | Chamfer Distance (CD) | 1.93 | 120 |
| View Synthesis and Surface Reconstruction | Shiny Blender | PSNR | 27.4 | 11 |
| 3D Reconstruction and Rendering | RedOx | PSNR | 27.21 | 9 |
| 3D Reconstruction and Rendering | Lays | PSNR | 27.28 | 9 |
| 3D Reconstruction and Rendering | GreenOx | PSNR | 27.35 | 9 |
| 3D Reconstruction and Rendering | Horse | PSNR | 23.45 | 9 |
| 3D Reconstruction and Rendering | Cat | PSNR | 23.27 | 9 |