
Ref-NeRF: Structured View-Dependent Appearance for Neural Radiance Fields

About

Neural Radiance Fields (NeRF) is a popular view synthesis technique that represents a scene as a continuous volumetric function, parameterized by multilayer perceptrons that provide the volume density and view-dependent emitted radiance at each location. While NeRF-based techniques excel at representing fine geometric structures with smoothly varying view-dependent appearance, they often fail to accurately capture and reproduce the appearance of glossy surfaces. We address this limitation by introducing Ref-NeRF, which replaces NeRF's parameterization of view-dependent outgoing radiance with a representation of reflected radiance and structures this function using a collection of spatially-varying scene properties. We show that together with a regularizer on normal vectors, our model significantly improves the realism and accuracy of specular reflections. Furthermore, we show that our model's internal representation of outgoing radiance is interpretable and useful for scene editing.
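The two ideas in the abstract can be sketched concretely. Ref-NeRF conditions its directional MLP on the view direction reflected about the predicted surface normal (rather than the raw view direction), and it regularizes normals so they do not face away from the camera. The sketch below is a minimal illustration of those two operations, not the paper's implementation; the function names and the exact penalty form are assumptions for illustration.

```python
import numpy as np

def reflect(view_dir, normal):
    """Reflect the view direction about the surface normal.

    Illustrative sketch: Ref-NeRF queries appearance as a function of the
    reflected direction w_r = 2 (w_o . n) n - w_o, which varies more
    smoothly for glossy surfaces than raw view-dependent radiance.
    Inputs are normalized here for safety.
    """
    view_dir = view_dir / np.linalg.norm(view_dir)
    normal = normal / np.linalg.norm(normal)
    return 2.0 * np.dot(view_dir, normal) * normal - view_dir

def orientation_penalty(weights, normals, ray_dir):
    """Sketch of a normal-orientation regularizer (assumed form).

    For samples along a ray with volume-rendering weights w_i and unit
    normals n_i, penalize normals that point along the ray direction d
    (i.e., away from the camera): sum_i w_i * max(0, n_i . d)^2.
    """
    dots = normals @ ray_dir          # per-sample n_i . d
    return np.sum(weights * np.maximum(0.0, dots) ** 2)
```

For example, reflecting a view direction of `[1, 0, 1]/sqrt(2)` about the normal `[0, 0, 1]` yields `[-1, 0, 1]/sqrt(2)`, the familiar mirror direction; the penalty is zero whenever every sampled normal points back toward the camera.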

Dor Verbin, Peter Hedman, Ben Mildenhall, Todd Zickler, Jonathan T. Barron, Pratul P. Srinivasan • 2021

Related benchmarks

| Task | Dataset | Result | Rank |
| --- | --- | --- | --- |
| View synthesis quality | NeRF Synthetic v1 (test) | PSNR 33.99 | 45 |
| Novel View Synthesis | Blender (test) | PSNR 33.99 | 37 |
| Novel View Synthesis | Shiny | PSNR 26.502 | 28 |
| Novel View Synthesis | NeRF-synthetic original (test) | PSNR 31.29 | 25 |
| Novel View Synthesis | Synthetic dynamic scenes | PSNR 35.88 | 19 |
| Reflective Object Reconstruction | Glossy Synthetic | PSNR 27.5 | 19 |
| Reflective Object Reconstruction | Shiny Blender | PSNR 33.12 | 18 |
| Novel View Synthesis | Shiny Blender | PSNR 35.96 | 13 |
| Surface Reconstruction | ShinySynthetic | MAE (Ball) 1.55 | 13 |
| Novel View Synthesis | ShinySynthetic | PSNR 33.32 | 12 |
Showing 10 of 47 rows
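Most rows above report PSNR (peak signal-to-noise ratio), the standard image-quality metric for view synthesis: PSNR = 10 log10(MAX^2 / MSE), in decibels, where MSE is the mean squared error against the ground-truth image and MAX is the peak pixel value (1.0 for normalized images). A minimal sketch, assuming images normalized to [0, 1]:

```python
import numpy as np

def psnr(pred, target, max_val=1.0):
    """Peak signal-to-noise ratio in dB between two images in [0, max_val].

    Higher is better; a uniform per-pixel error of 0.1 on normalized
    images gives MSE = 0.01 and hence PSNR = 20 dB.
    """
    mse = np.mean((pred - target) ** 2)
    return 10.0 * np.log10(max_val ** 2 / mse)
```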
