Binocular-Guided 3D Gaussian Splatting with View Consistency for Sparse View Synthesis

About

Novel view synthesis from sparse inputs is a vital yet challenging task in 3D computer vision. Previous methods explore 3D Gaussian Splatting with neural priors (e.g., depth priors) as additional supervision, demonstrating promising quality and efficiency compared to NeRF-based methods. However, the neural priors from 2D pretrained models are often noisy and blurry, and struggle to precisely guide the learning of radiance fields. In this paper, we propose a novel method for synthesizing novel views from sparse views with Gaussian Splatting that does not require external priors as supervision. Our key idea lies in exploiting the self-supervision inherent in the binocular stereo consistency between each pair of binocular images constructed with disparity-guided image warping. To this end, we additionally introduce a Gaussian opacity constraint that regularizes Gaussian locations and avoids Gaussian redundancy, improving the robustness and efficiency of inferring 3D Gaussians from sparse views. Extensive experiments on the LLFF, DTU, and Blender datasets demonstrate that our method significantly outperforms state-of-the-art methods.
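The binocular stereo consistency in the abstract rests on a standard idea: given a per-pixel disparity map for a rectified stereo pair, one view can be warped into the other, and the photometric difference between the real and warped views serves as a self-supervised loss. The sketch below (not the authors' code) illustrates this with NumPy, assuming rectified grayscale images and nearest-neighbor sampling; `warp_by_disparity` and `stereo_consistency_loss` are hypothetical helper names.

```python
import numpy as np

def warp_by_disparity(right_img, disparity):
    """Reconstruct the left view by sampling each left pixel (y, x)
    from right-view location (y, x - disparity).
    right_img: (H, W) grayscale image; disparity: (H, W) pixel shifts."""
    H, W = right_img.shape
    ys, xs = np.meshgrid(np.arange(H), np.arange(W), indexing="ij")
    src_x = np.clip(np.round(xs - disparity).astype(int), 0, W - 1)
    return right_img[ys, src_x]

def stereo_consistency_loss(left_img, right_img, disparity):
    """Mean absolute photometric error between the true left view
    and the right view warped into the left view."""
    return float(np.mean(np.abs(left_img - warp_by_disparity(right_img, disparity))))
```

In the paper's setting the disparity would come from depths rendered by the 3D Gaussians, so minimizing this photometric error supervises the radiance field without any external prior; a practical version would use differentiable (e.g., bilinear) sampling rather than nearest-neighbor rounding.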

Liang Han, Junsheng Zhou, Yu-Shen Liu, Zhizhong Han • 2024

Related benchmarks

Task                   Dataset              Result        Rank
Novel View Synthesis   LLFF 3-view          PSNR 21.44    130
Novel View Synthesis   LLFF                 PSNR 21.44    130
Novel View Synthesis   DTU                  PSNR 20.71    115
Novel View Synthesis   LLFF 6-view          PSNR 24.87    105
Novel View Synthesis   DTU (test)           PSNR 26.7     101
Novel View Synthesis   LLFF 9-view          PSNR 26.17    97
Novel View Synthesis   DTU 3-view           PSNR 20.71    58
Novel View Synthesis   DTU 6-view           PSNR 24.31    58
Novel View Synthesis   LLFF 3-view (test)   PSNR 21.44    39
Novel View Synthesis   Blender (test)       PSNR 24.71    37

(10 of 14 rows shown)
