
SGS-Intrinsic: Semantic-Invariant Gaussian Splatting for Sparse-View Indoor Inverse Rendering

About

We present SGS-Intrinsic, an indoor inverse rendering framework that works well with sparse-view images. Unlike existing 3D Gaussian Splatting (3DGS) based methods, which focus on object-centric reconstruction and fail under sparse-view settings, our method achieves high-quality geometry reconstruction and accurate disentanglement of material and illumination. The core idea is to construct a dense, geometry-consistent Gaussian semantic field guided by semantic and geometric priors, providing a reliable foundation for subsequent inverse rendering. Building on this, we perform material-illumination disentanglement by combining a hybrid illumination model with a material prior to effectively capture illumination-material interactions. To mitigate the impact of cast shadows and enhance the robustness of material recovery, we introduce an illumination-invariant material constraint together with a deshadowing model. Extensive experiments on benchmark datasets show that our method consistently improves both reconstruction fidelity and inverse rendering quality over existing 3DGS-based inverse rendering approaches. Our code is available at https://github.com/GrumpySloths/SGS_Intrinsic.github.io.

Jiahao Niu, Rongjia Zheng, Wenju Xu, Wei-Shi Zheng, Qing Zhang • 2026

Related benchmarks

| Task | Dataset | Result | Rank |
| --- | --- | --- | --- |
| Novel View Synthesis | FIPT synthetic dataset | PSNR 20.1 | 11 |
| Novel View Synthesis | MipNeRF | PSNR 19.73 | 10 |
| Inverse Rendering | Interiorverse synthetic indoor scenes | Roughness MSE 16.1 | 7 |
| Novel View Synthesis | Interiorverse synthetic indoor scenes | PSNR 21.7 | 7 |
| Novel View Synthesis (PBR) | Interiorverse synthetic indoor scenes | PSNR 20.9 | 7 |
| Albedo Estimation | TensoIR | PSNR 25.65 | 6 |
| Novel View Synthesis | DL3DV | PSNR 19.31 | 6 |
| Novel View Synthesis for PBR | TensoIR | PSNR 26.02 | 6 |
