SGS-Intrinsic: Semantic-Invariant Gaussian Splatting for Sparse-View Indoor Inverse Rendering
About
We present SGS-Intrinsic, an indoor inverse rendering framework designed for sparse-view images. Unlike existing 3D Gaussian Splatting (3DGS) based methods, which focus on object-centric reconstruction and fail under sparse-view settings, our method achieves high-quality geometry reconstruction and accurate disentanglement of material and illumination. The core idea is to construct a dense, geometry-consistent Gaussian semantic field guided by semantic and geometric priors, providing a reliable foundation for subsequent inverse rendering. Building on this, we perform material-illumination disentanglement by combining a hybrid illumination model with material priors to effectively capture illumination-material interactions. To mitigate the impact of cast shadows and enhance the robustness of material recovery, we introduce an illumination-invariant material constraint together with a deshadowing model. Extensive experiments on benchmark datasets show that our method consistently improves both reconstruction fidelity and inverse rendering quality over existing 3DGS-based inverse rendering approaches. Our code is available at https://github.com/GrumpySloths/SGS_Intrinsic.github.io.
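The abstract does not spell out the form of the illumination-invariant material constraint. As an illustrative sketch only (not the paper's actual formulation), one common way to build such a constraint is to compare the *chromaticity* of the predicted albedo with that of the input image, since per-pixel normalized RGB cancels any grayscale illumination scaling; the function names and loss form below are hypothetical:

```python
import numpy as np

def chromaticity(img, eps=1e-6):
    # Per-pixel normalized RGB: a uniform (gray) illumination scale cancels out.
    return img / (img.sum(axis=-1, keepdims=True) + eps)

def illumination_invariant_loss(albedo, image):
    # L1 distance between chromaticities of predicted albedo and observed image.
    # Hypothetical stand-in for the paper's illumination-invariant constraint.
    return np.abs(chromaticity(albedo) - chromaticity(image)).mean()

# A gray illumination change (uniform brightness scaling) leaves the loss unchanged,
# which is what makes the constraint robust to cast shadows of neutral color.
rng = np.random.default_rng(0)
image = rng.uniform(0.1, 1.0, size=(4, 4, 3))
albedo = rng.uniform(0.1, 1.0, size=(4, 4, 3))
loss_bright = illumination_invariant_loss(albedo, image)
loss_dim = illumination_invariant_loss(albedo, 0.5 * image)  # darkened input
```

Colored (non-gray) illumination shifts chromaticity and would still be penalized, which is why the paper pairs such a constraint with a dedicated deshadowing model.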
Related benchmarks
| Task | Dataset | Result | Rank |
|---|---|---|---|
| Novel View Synthesis | FIPT synthetic dataset | PSNR 20.1 | 11 |
| Novel View Synthesis | MipNeRF | PSNR 19.73 | 10 |
| Inverse Rendering | Interiorverse synthetic indoor scenes | Roughness MSE 16.1 | 7 |
| Novel View Synthesis | Interiorverse synthetic indoor scenes | PSNR 21.7 | 7 |
| Novel View Synthesis (PBR) | Interiorverse synthetic indoor scenes | PSNR 20.9 | 7 |
| Albedo Estimation | TensoIR | PSNR 25.65 | 6 |
| Novel View Synthesis | DL3DV | PSNR 19.31 | 6 |
| Novel View Synthesis (PBR) | TensoIR | PSNR 26.02 | 6 |