
NeRF On-the-go: Exploiting Uncertainty for Distractor-free NeRFs in the Wild

About

Neural Radiance Fields (NeRFs) have shown remarkable success in synthesizing photorealistic views from multi-view images of static scenes, but face challenges in dynamic, real-world environments with distractors like moving objects, shadows, and lighting changes. Existing methods manage controlled environments and low occlusion ratios but fall short in render quality, especially under high occlusion scenarios. In this paper, we introduce NeRF On-the-go, a simple yet effective approach that enables the robust synthesis of novel views in complex, in-the-wild scenes from only casually captured image sequences. Delving into uncertainty, our method not only efficiently eliminates distractors, even when they are predominant in captures, but also achieves a notably faster convergence speed. Through comprehensive experiments on various scenes, our method demonstrates a significant improvement over state-of-the-art techniques. This advancement opens new avenues for NeRF in diverse and dynamic real-world applications.
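The core idea sketched in the abstract, using per-ray uncertainty to down-weight distractor pixels during training, can be illustrated with a minimal toy example. The snippet below is not the paper's implementation; it is a generic aleatoric-uncertainty-weighted photometric loss (all function and variable names here are invented for illustration) showing how assigning high uncertainty to a distractor ray reduces its influence on the objective:

```python
import numpy as np

def uncertainty_weighted_loss(rendered, target, log_sigma):
    """Photometric loss down-weighted by a predicted per-ray uncertainty.

    A standard aleatoric-uncertainty formulation: sigma is the model's
    predicted noise level for each ray, so rays covering distractors can
    be assigned high sigma and contribute little to the color term, while
    the log-sigma regularizer keeps uncertainty from growing unboundedly.
    """
    sigma2 = np.exp(2.0 * log_sigma)                      # predicted variance per ray
    residual = np.sum((rendered - target) ** 2, axis=-1)  # squared color error per ray
    return float(np.mean(residual / (2.0 * sigma2) + log_sigma))

# Toy example: ray 0 sees a static region (zero error), ray 1 sees a
# moving distractor (a large error the static scene cannot explain).
rendered = np.array([[0.5, 0.5, 0.5], [0.0, 0.0, 0.0]])
target   = np.array([[0.5, 0.5, 0.5], [1.0, 1.0, 1.0]])

uniform  = uncertainty_weighted_loss(rendered, target, np.array([0.0, 0.0]))
adaptive = uncertainty_weighted_loss(rendered, target, np.array([0.0, 0.55]))
assert adaptive < uniform  # marking the distractor ray uncertain lowers the loss
```

In effect, the optimizer can "explain away" pixels the static scene cannot fit by raising their uncertainty instead of corrupting the geometry, which is the behavior the paper exploits for distractor removal.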

Weining Ren, Zihan Zhu, Boyang Sun, Jiaqi Chen, Marc Pollefeys, Songyou Peng · 2024

Related benchmarks

| Task | Dataset | Result | Rank |
| --- | --- | --- | --- |
| Novel View Synthesis | D-RE10K static regions only (test) | PSNR 19.52 | 26 |
| Novel View Synthesis | D-RE10K-iPhone full-image fidelity (test) | PSNR 17.53 | 26 |
| Novel View Synthesis | RobustNeRF Baby Yoda scene | LPIPS 0.068 | 20 |
| Novel View Synthesis | RobustNeRF Android | PSNR 24.42 | 17 |
| Novel View Synthesis | RobustNeRF Statue | PSNR 22.34 | 17 |
| Novel View Synthesis | RobustNeRF Crab | PSNR 29.65 | 16 |
| Novel View Synthesis | On-the-go Dataset | PSNR (Mountain) 21.27 | 12 |
| Novel View Synthesis | RobustNeRF Avg. | PSNR 26.76 | 12 |
| Reconstruction Quality | FullCircle Perspective undistorted fisheye | Room 1 Reconstruction Quality 80 | 9 |
| Novel View Synthesis | Kubric Cars | LPIPS 0.035 | 6 |

Showing 10 of 33 rows.

Other info

Code
