
FreeNeRF: Improving Few-shot Neural Rendering with Free Frequency Regularization

About

Novel view synthesis with sparse inputs is a challenging problem for neural radiance fields (NeRF). Recent efforts alleviate this challenge by introducing external supervision, such as pre-trained models and extra depth signals, or through non-trivial patch-based rendering. In this paper, we present Frequency regularized NeRF (FreeNeRF), a surprisingly simple baseline that outperforms previous methods with minimal modifications to the plain NeRF. We analyze the key challenges in few-shot neural rendering and find that frequency plays an important role in NeRF's training. Based on this analysis, we propose two regularization terms: one regularizes the frequency range of NeRF's inputs, while the other penalizes the near-camera density fields. Both techniques are "free lunches" that incur no additional computational cost. We demonstrate that with even a one-line code change, the original NeRF can achieve performance comparable to more complicated methods in the few-shot setting. FreeNeRF achieves state-of-the-art performance across diverse datasets, including Blender, DTU, and LLFF. We hope this simple baseline will motivate a rethinking of the fundamental role of frequency in NeRF's training under the low-data regime and beyond.
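The two regularizers described above can be sketched in a few lines. The following is a minimal, hypothetical illustration (not the authors' exact code): a frequency mask whose visible prefix over the positional-encoding features grows linearly during early training, and an occlusion loss that penalizes predicted density at the samples nearest the camera. The function names, array shapes, and the linear schedule are assumptions for illustration.

```python
import numpy as np

def get_freq_mask(pos_enc_length, current_step, total_reg_steps):
    """Sketch of frequency regularization: mask high-frequency
    positional-encoding features early in training, then linearly
    reveal them until total_reg_steps is reached."""
    if current_step >= total_reg_steps:
        return np.ones(pos_enc_length)
    mask = np.zeros(pos_enc_length)
    # Fraction of the encoding that is currently visible.
    ptr = pos_enc_length * current_step / total_reg_steps
    int_ptr = int(ptr)
    mask[:int_ptr] = 1.0
    if int_ptr < pos_enc_length:
        # Fractional weight on the boundary feature for a smooth schedule.
        mask[int_ptr] = ptr - int_ptr
    return mask

def occlusion_loss(sigmas, num_near_samples):
    """Sketch of the near-camera density penalty: average the predicted
    densities of the first num_near_samples points along each ray
    (assumed shape: [num_rays, num_samples_per_ray], sorted near-to-far)."""
    return sigmas[:, :num_near_samples].mean()
```

In use, the mask is simply multiplied element-wise with the positional encoding before it enters the MLP (the "one line of code change"), and the occlusion loss is added to the photometric loss with a small weight.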

Jiawei Yang, Marco Pavone, Yue Wang • 2023

Related benchmarks

Task                  Dataset               Result        Rank
Novel View Synthesis  Tanks&Temples (test)  PSNR 14.34    257
Novel View Synthesis  RealEstate10K         PSNR 27.32    173
Novel View Synthesis  Mip-NeRF360           PSNR 14.59    138
Novel View Synthesis  LLFF                  PSNR 25.12    130
Novel View Synthesis  LLFF 3-view           PSNR 19.71    130
Novel View Synthesis  DTU                   PSNR 25.56    115
Novel View Synthesis  NeRF Synthetic        --            110
Novel View Synthesis  LLFF 6-view           PSNR 23.73    105
Novel View Synthesis  DTU (test)            PSNR 25.38    101
Novel View Synthesis  LLFF 9-view           PSNR 25.13    97
Showing 10 of 75 rows
