
A Step to Decouple Optimization in 3DGS

About

3D Gaussian Splatting (3DGS) has emerged as a powerful technique for real-time novel view synthesis. As an explicit representation optimized through gradient propagation among primitives, 3DGS directly adopts optimization practices common in deep neural networks (DNNs), such as synchronous weight updates and Adam's adaptive gradients. However, given the physical significance and specific design of 3DGS, two details of its optimization have been overlooked: (i) update-step coupling, which induces optimizer-state rescaling and costly attribute updates outside the current viewpoints, and (ii) gradient coupling in the moments, which may lead to under- or over-effective regularization. This complex coupling remains under-explored. After revisiting the optimization of 3DGS, we take a step toward decoupling it and recompose the process into Sparse Adam, Re-State Regularization, and Decoupled Attribute Regularization. Through extensive experiments under the 3DGS and 3DGS-MCMC frameworks, our work provides a deeper understanding of these components. Finally, based on this empirical analysis, we re-design the optimization and propose AdamW-GS by re-coupling the beneficial components, achieving better optimization efficiency and representation effectiveness simultaneously.
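The two decoupling ideas in the abstract can be illustrated with a minimal sketch: an AdamW-style step that updates moments and attributes only for Gaussians visible in the current viewpoint, and applies weight decay directly to the parameter rather than folding it into the gradient (and hence into the Adam moments). All names and hyperparameters below are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def sparse_adamw_step(param, grad, m, v, visible, t,
                      lr=1e-3, beta1=0.9, beta2=0.999,
                      eps=1e-8, weight_decay=1e-2):
    """One AdamW-style step applied only to rows where visible is True.

    - Sparse update: moments and parameters of Gaussians outside the
      viewpoint are left untouched, avoiding optimizer-state rescaling
      and costly off-view attribute updates.
    - Decoupled regularization: weight decay acts on the parameter
      directly instead of entering the moment estimates.
    """
    idx = np.flatnonzero(visible)          # indices of visible Gaussians
    g = grad[idx]
    m[idx] = beta1 * m[idx] + (1 - beta1) * g          # first moment
    v[idx] = beta2 * v[idx] + (1 - beta2) * g * g      # second moment
    m_hat = m[idx] / (1 - beta1 ** t)                  # bias correction
    v_hat = v[idx] / (1 - beta2 ** t)
    # Adam step plus decoupled (AdamW-style) weight decay
    param[idx] -= lr * (m_hat / (np.sqrt(v_hat) + eps)
                        + weight_decay * param[idx])
    return param, m, v
```

A call on a toy attribute matrix with a visibility mask leaves the masked-out rows (and their optimizer state) bitwise unchanged, which is the efficiency argument the abstract makes for sparse updates.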

Renjie Ding, Yaonan Wang, Min Liu, Jialin Zhu, Jiazheng Wang, Jiahao Zhao, Wenting Shen, Feixiang He, Xiang Chen • 2026

Related benchmarks

Task                  Dataset              Metric  Result  Rank
Novel View Synthesis  MipNeRF 360 Outdoor  PSNR    25.247  112
Novel View Synthesis  MipNeRF 360 Indoor   PSNR    31.934  108
Novel View Synthesis  Mip-NeRF 360         PSNR    28.219  102
Novel View Synthesis  Deep Blending        PSNR    30.417  5
Novel View Synthesis  Tanks&Temples        PSNR    24.726  5
