
Gaussian Mapping for Evolving Scenes

About

Mapping systems with novel view synthesis (NVS) capabilities, most notably 3D Gaussian Splatting (3DGS), are widely used in computer vision and in applications such as augmented reality, robotics, and autonomous driving. However, many current approaches are limited to static scenes. While recent works have begun addressing short-term dynamics (motion within the camera's view), long-term dynamics (the scene evolving through changes out of view) remain less explored. To overcome this limitation, we introduce a dynamic scene adaptation mechanism that continuously updates the 3DGS representation to reflect the latest changes. Since stale observations disrupt the reconstruction process and make consistency hard to maintain, we further propose a novel keyframe management mechanism that discards outdated observations while preserving as much information as possible. We thoroughly evaluate Gaussian Mapping for Evolving Scenes (GaME) on both synthetic and real-world datasets, achieving a 29.7% improvement in PSNR and a 3× reduction in L1 depth error over the most competitive baseline.
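The keyframe management idea above can be sketched in a few lines: keep a pool of posed observations, score each one by how strongly it contradicts the latest measurements, and discard those that have become mostly outdated. This is a minimal illustration only; the class, field names, and the fixed staleness threshold are assumptions for exposition, not GaME's actual implementation.

```python
from dataclasses import dataclass


@dataclass
class Keyframe:
    """A posed observation. Fields are illustrative, not GaME's API."""
    frame_id: int
    timestamp: float
    # Fraction of this keyframe's pixels that contradict newer observations
    # (e.g. via depth/color consistency checks); 0.0 = fully up to date.
    staleness: float = 0.0


class KeyframeManager:
    """Toy keyframe culling: drop observations that are mostly outdated
    while keeping the rest for supervision. Threshold is an assumption."""

    def __init__(self, staleness_threshold: float = 0.5):
        self.threshold = staleness_threshold
        self.keyframes: list[Keyframe] = []

    def add(self, kf: Keyframe) -> None:
        self.keyframes.append(kf)

    def update_staleness(self, frame_id: int, staleness: float) -> None:
        # In a real system this score would come from comparing renders
        # against the newest depth/color frames.
        for kf in self.keyframes:
            if kf.frame_id == frame_id:
                kf.staleness = staleness

    def cull(self) -> list[Keyframe]:
        """Remove keyframes whose content is mostly stale; return survivors."""
        self.keyframes = [
            kf for kf in self.keyframes if kf.staleness < self.threshold
        ]
        return self.keyframes
```

The key design point, per the abstract, is selectivity: only observations that actually contradict the current scene state are discarded, so information from still-valid viewpoints is preserved.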

Vladimir Yugay, Thies Kersten, Luca Carlone, Theo Gevers, Martin R. Oswald, Lukas Schmid • 2025

Related benchmarks

Task | Dataset | Metric | Result | Rank
--- | --- | --- | --- | ---
Evolving Scene Mapping | Flat input views synthetic | PSNR (dB) | 24.55 | 4
Evolving Scene Mapping | Flat synthetic (novel views) | PSNR (dB) | 24.26 | 4
Novel View Synthesis | Aria room0 | PSNR (Input View) [dB] | 31.54 | 4
Novel View Synthesis | Aria room1 | PSNR (Input View) [dB] | 31.23 | 4
Novel View Synthesis | Aria Average | PSNR (Input View) [dB] | 31.39 | 4
Online Mapping | Flat dataset | FPS | 0.52 | 4
Rendering Performance | TUM-RGBD | Rendering Time (ms) | 20.18 | 3
