
NeILF: Neural Incident Light Field for Physically-based Material Estimation

About

We present a differentiable rendering framework for material and lighting estimation from multi-view images and a reconstructed geometry. In the framework, we represent scene lighting as the Neural Incident Light Field (NeILF) and material properties as a surface BRDF, both modelled by multi-layer perceptrons. Compared with recent approaches that approximate scene lighting as a 2D environment map, NeILF is a fully 5D light field that is capable of modelling the illumination of any static scene. In addition, occlusions and indirect lights can be handled naturally by the NeILF representation without requiring multiple bounces of ray tracing, making it possible to estimate material properties even for scenes with complex lighting and geometry. We also propose a smoothness regularization and a Lambertian assumption to reduce the material-lighting ambiguity during the optimization. Our method strictly follows the physically-based rendering equation, and jointly optimizes material and lighting through the differentiable rendering process. We have intensively evaluated the proposed method on our in-house synthetic dataset, the DTU MVS dataset, and real-world BlendedMVS scenes. Our method outperforms previous methods by a significant margin in terms of novel view rendering quality, setting a new state of the art for image-based material and lighting estimation.
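To make the rendering model concrete: the outgoing radiance at a surface point is the hemisphere integral of the incident light field times the BRDF times the cosine term. The sketch below is a minimal, hypothetical illustration of that integral with a Monte Carlo estimator, using a constant stand-in for the NeILF network and a purely Lambertian BRDF (matching the paper's Lambertian assumption); it is not the authors' implementation, where both the light field and the BRDF are MLPs optimized by differentiable rendering.

```python
import numpy as np

def sample_hemisphere(normal, n_samples, rng):
    """Uniformly sample directions on the hemisphere around `normal`."""
    v = rng.normal(size=(n_samples, 3))
    v /= np.linalg.norm(v, axis=1, keepdims=True)
    # flip samples that fall below the surface into the upper hemisphere
    v[np.sum(v * normal, axis=1) < 0.0] *= -1.0
    return v

def toy_neilf(x, omega):
    """Stand-in for the NeILF MLP: incident radiance L_i(x, omega).
    Here a constant white light; the real model is a 5D neural field."""
    return np.ones((omega.shape[0], 3))

def render(x, normal, albedo, n_samples=1024, seed=0):
    """Monte Carlo estimate of the rendering equation with a
    Lambertian BRDF:  L_o = ∫ (albedo/π) · L_i(x, ω) · (n·ω) dω."""
    rng = np.random.default_rng(seed)
    omega = sample_hemisphere(normal, n_samples, rng)
    cos_theta = np.sum(omega * normal, axis=1, keepdims=True)
    l_i = toy_neilf(x, omega)
    brdf = albedo / np.pi
    # uniform hemisphere pdf = 1 / (2π), hence the 2π factor
    return np.mean(brdf * l_i * cos_theta, axis=0) * 2.0 * np.pi
```

Under a constant unit-radiance light the integral evaluates exactly to the albedo, which makes the Lambertian case a convenient sanity check; in the full method this forward model is differentiated through to optimize the two networks jointly.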

Yao Yao, Jingyang Zhang, Jingbo Liu, Yihang Qu, Tian Fang, David McKinnon, Yanghai Tsin, Long Quan • 2022

Related benchmarks

Task                 | Dataset                                 | Result (PSNR) | Rank
Albedo Estimation    | Synthetic dataset                       | 17.0707       | 6
Roughness Estimation | Synthetic dataset                       | 11.5778       | 6
Novel View Synthesis | Synthetic dataset (novel views)         | 22.3765       | 5
Re-rendering         | Synthetic dataset (optimization views)  | 25.1092       | 5
Image Re-rendering   | Authors' Real Dataset                   | 21.926        | 4
