Learning Hierarchical Sparse Transform Coding for 3DGS Compression

About

Current 3DGS compression methods largely forgo the neural analysis-synthesis transform, a crucial component in learned signal compression systems. As a result, redundancy removal is left solely to the entropy coder, overburdening the entropy coding module and reducing rate-distortion (R-D) performance. To address this omission, we propose a training-time transform coding (TTC) method that adds the analysis-synthesis transform and optimizes it jointly with the 3DGS representation and entropy model. Concretely, we adopt a hierarchical design: a channel-wise KLT for decorrelation and energy compaction, followed by a sparsity-aware neural transform that reconstructs the KLT residuals with minimal parameter and computational overhead. Experiments show that our method delivers strong R-D performance with fast decoding, offering a favorable trade-off between BD-rate and decoding time over SOTA 3DGS compressors.
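The first stage of the hierarchy, a channel-wise KLT, can be illustrated with a short NumPy sketch. This is not the paper's implementation; the feature matrix, dimensions, and truncation point `k` below are illustrative assumptions. It shows the core idea: project per-Gaussian channel vectors onto the eigenbasis of their covariance, so that energy compacts into the leading coefficients and the residual is left for the second-stage neural transform.

```python
import numpy as np

# Hypothetical per-Gaussian feature matrix: N Gaussians x C channels
# (e.g. appearance/geometry attributes); sizes are illustrative.
rng = np.random.default_rng(0)
N, C = 10000, 16
feats = rng.normal(size=(N, C)) @ rng.normal(size=(C, C))  # correlated channels

# Channel-wise KLT: eigenbasis of the channel covariance matrix.
mean = feats.mean(axis=0)
cov = np.cov(feats - mean, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)
order = np.argsort(eigvals)[::-1]        # sort by energy, descending
basis = eigvecs[:, order]                # C x C orthonormal KLT basis

# Analysis transform: decorrelated KLT coefficients.
coeffs = (feats - mean) @ basis

# Energy compaction: variance is concentrated in the leading channels.
energy = coeffs.var(axis=0)

# Coarse synthesis from the first k coefficients; the residual is what a
# sparsity-aware neural transform would then model in the second stage.
k = 4
recon = coeffs[:, :k] @ basis[:, :k].T + mean
residual = feats - recon
```

Because the basis is orthonormal, keeping all C coefficients reconstructs the features exactly; truncating to the top-k coefficients discards only the lowest-energy directions, which is what makes the KLT residual cheap for a small neural transform to model.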

Hao Xu, Xiaolin Wu, Xi Zhang · 2025

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
| --- | --- | --- | --- | --- |
| 3D Reconstruction | Mip-NeRF 360 | SSIM | 0.804 | 37 |
| 3D Scene Reconstruction | Deep Blending | PSNR | 30.19 | 30 |
| 3D Scene Reconstruction | Tanks and Temples | PSNR | 24.34 | 26 |
| 3D Gaussian Splatting Compression | Mip-NeRF 360 (test) | BD-rate | -64.82 | 5 |
| 3DGS Compression | Mip-NeRF 360 | BD-rate vs HAC++ | -20.81 | 1 |
| 3DGS Compression | Tanks and Temples | BD-rate vs HAC++ | -22.55 | 1 |
| 3DGS Compression | Deep Blending | BD-rate vs HAC++ | -19.58 | 1 |
| 3DGS Compression | BungeeNeRF | BD-rate vs HAC++ | -10.04 | 1 |
| 3DGS Compression | Synthetic-NeRF | BD-rate vs HAC++ | -13.45 | 1 |
