
EVPGS: Enhanced View Prior Guidance for Splatting-based Extrapolated View Synthesis

About

Gaussian Splatting (GS)-based methods rely on sufficient training view coverage and perform synthesis on interpolated views. In this work, we tackle the more challenging and underexplored Extrapolated View Synthesis (EVS) task, enabling GS-based models trained with limited view coverage to generalize well to extrapolated views. To achieve this, we propose a view augmentation framework that guides training through a coarse-to-fine process. At the coarse stage, we reduce rendering artifacts caused by insufficient view coverage by introducing a regularization strategy at both the appearance and geometry levels. At the fine stage, we generate reliable view priors to provide further training guidance. To this end, we incorporate occlusion awareness into the view prior generation process and refine the view priors with the aid of the coarse-stage output. We call our framework Enhanced View Prior Guidance for Splatting (EVPGS). To comprehensively evaluate EVPGS on the EVS task, we collect a real-world dataset, Merchandise3D, dedicated to the EVS scenario. Experiments on three datasets, both real and synthetic, demonstrate that EVPGS achieves state-of-the-art performance, improving synthesis quality at extrapolated views for GS-based methods both qualitatively and quantitatively. We will make our code, dataset, and models public.
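The coarse-to-fine guidance described above can be sketched as a two-stage training loop. This is a minimal illustrative sketch, not the authors' implementation: every function name (`render`, `geo_reg`, `make_prior`, `refine`) and the scalar toy data are hypothetical placeholders assumed for clarity.

```python
# Hypothetical sketch of the EVPGS coarse-to-fine guidance.
# All names here are illustrative placeholders, not the paper's actual API.

def coarse_stage(model, train_views, reg_weight=0.1):
    """Coarse stage: photometric loss on training views plus an
    appearance/geometry regularizer that suppresses artifacts
    caused by limited view coverage."""
    losses = []
    for view in train_views:
        render = model["render"](view)
        loss = (render - view["gt"]) ** 2            # photometric term
        loss += reg_weight * model["geo_reg"](view)  # regularization term
        losses.append(loss)
    return sum(losses) / len(losses)

def fine_stage(model, extrapolated_views):
    """Fine stage: generate occlusion-aware view priors at extrapolated
    poses, refine them using the coarse-stage output, and use them as
    extra supervision."""
    losses = []
    for view in extrapolated_views:
        prior = model["make_prior"](view)  # occlusion-aware view prior
        prior = model["refine"](prior)     # refined with coarse-stage output
        render = model["render"](view)
        losses.append((render - prior) ** 2)
    return sum(losses) / len(losses)
```

In this sketch the coarse stage supervises only where ground truth exists, while the fine stage supplies pseudo-supervision at extrapolated poses, which is the core idea of the view augmentation framework.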

Jiahe Li, Feiyu Wang, Xiaochao Qu, Chengjing Wu, Luoqi Liu, Ting Liu • 2025

Related benchmarks

Task                   Dataset                 Metric   Result   Rank
Novel View Synthesis   DTU                     PSNR     26.488   100
Novel View Synthesis   NeRF Synthetic          PSNR     27.849   92
Novel View Synthesis   Mip-NeRF360 (test)      PSNR     19.446   58
Novel View Synthesis   Synthetic-NeRF (test)   PSNR     27.849   48
Novel View Synthesis   DTU 15 (test)           PSNR     26.488   15
Novel View Synthesis   Merchandise3D (test)    PSNR     25.136   11
Novel View Synthesis   Merchandise3D           PSNR     25.136   4
