
Modeling Indirect Illumination for Inverse Rendering

About

Recent advances in implicit neural representations and differentiable rendering make it possible to simultaneously recover the geometry and materials of an object from multi-view RGB images captured under unknown static illumination. Despite the promising results achieved, indirect illumination is rarely modeled in previous methods, as it requires expensive recursive path tracing which makes the inverse rendering computationally intractable. In this paper, we propose a novel approach to efficiently recovering spatially-varying indirect illumination. The key insight is that indirect illumination can be conveniently derived from the neural radiance field learned from input images instead of being estimated jointly with direct illumination and materials. By properly modeling the indirect illumination and visibility of direct illumination, interreflection- and shadow-free albedo can be recovered. The experiments on both synthetic and real data demonstrate the superior performance of our approach compared to previous work and its capability to synthesize realistic renderings under novel viewpoints and illumination. Our code and data are available at https://zju3dv.github.io/invrender/.
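The key insight above, querying the learned radiance field for indirect illumination instead of recursively path tracing, can be sketched as a Monte Carlo estimate of the incoming radiance at a surface point. This is an illustrative sketch, not the paper's implementation: `radiance_field` is a hypothetical stand-in for the pre-trained NeRF, and the cosine-weighted hemisphere sampler is a standard technique, not taken from the paper.

```python
import numpy as np

def sample_hemisphere(normal, n_dirs, rng):
    """Cosine-weighted directions on the hemisphere around `normal` (unit vector)."""
    u1, u2 = rng.random(n_dirs), rng.random(n_dirs)
    r = np.sqrt(u1)
    phi = 2.0 * np.pi * u2
    # local frame: z axis along the normal
    local = np.stack([r * np.cos(phi), r * np.sin(phi), np.sqrt(1.0 - u1)], axis=-1)
    # build an orthonormal basis (t, b, normal) around the normal
    up = np.array([0.0, 0.0, 1.0]) if abs(normal[2]) < 0.9 else np.array([1.0, 0.0, 0.0])
    t = np.cross(up, normal)
    t /= np.linalg.norm(t)
    b = np.cross(normal, t)
    return local @ np.stack([t, b, normal])  # (n_dirs, 3) world-space directions

def indirect_radiance(point, normal, radiance_field, n_dirs=64, seed=0):
    """Estimate indirect incoming radiance at `point` by querying a learned
    radiance field along secondary rays, instead of recursive path tracing.
    `radiance_field(point, dirs)` is assumed to return (n_dirs, 3) RGB radiance
    arriving at `point` from each direction."""
    rng = np.random.default_rng(seed)
    dirs = sample_hemisphere(normal, n_dirs, rng)
    li = radiance_field(point, dirs)
    # with cosine-weighted sampling, the estimator reduces to a simple mean
    return li.mean(axis=0)
```

For a sanity check, a constant radiance field should be returned unchanged: `indirect_radiance(np.zeros(3), np.array([0.0, 0.0, 1.0]), lambda p, d: np.tile([1.0, 0.5, 0.25], (d.shape[0], 1)))` yields approximately `[1.0, 0.5, 0.25]`.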

Yuanqing Zhang, Jiaming Sun, Xingyi He, Huan Fu, Rongfei Jia, Xiaowei Zhou • 2022

Related benchmarks

Task | Dataset | Metric | Result | Rank
Novel Scene Relighting | Stanford-ORB 1.0 (test) | PSNR-H | 23.76 | 26
Relighting | Synthetic Scenes (test) | PSNR | 25.5934 | 16
Albedo Estimation | MII dataset | PSNR | 25.77 | 14
Novel View Synthesis | Synthetic Dataset (test) | PSNR | 30.8743 | 13
Albedo Estimation | Synthetic Dataset (test) | PSNR | 27.4026 | 10
Novel View Synthesis | TensoIR Synthetic | PSNR | 30.92 | 10
3D Reconstruction and Rendering | Horse | PSNR | 24.92 | 9
3D Reconstruction and Rendering | GreenOx | PSNR | 27.32 | 9
3D Reconstruction and Rendering | Lays | PSNR | 25.61 | 9
3D Reconstruction and Rendering | RedOx | PSNR | 22.47 | 9

Showing 10 of 41 rows
