
Leveraging Second-Order Curvature for Efficient Learned Image Compression: Theory and Empirical Evidence

About

Training learned image compression (LIC) models entails navigating a challenging optimization landscape defined by the fundamental trade-off between rate and distortion. Standard first-order optimizers such as SGD and Adam struggle with gradient conflicts arising from the competing objectives, leading to slow convergence and suboptimal rate-distortion performance. In this work, we demonstrate that simply substituting in a second-order quasi-Newton optimizer, SOAP, dramatically improves both training efficiency and final performance across diverse LICs. Our theoretical and empirical analyses reveal that Newton preconditioning inherently resolves the intra-step and inter-step update conflicts intrinsic to the R-D objective, enabling faster and more stable convergence. Beyond acceleration, we uncover a critical deployability benefit: second-order-trained models exhibit significantly fewer activation and latent outliers, which substantially enhances robustness to post-training quantization. Together, these results establish second-order optimization, achievable as a seamless drop-in replacement of the optimizer import, as a powerful, practical tool for advancing the efficiency and real-world readiness of LICs.
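The intuition behind "Newton preconditioning resolves update conflicts" can be seen on a toy problem. The sketch below is not the paper's SOAP optimizer (which is a full quasi-Newton method over neural-network parameters); it is a minimal, hypothetical illustration on an ill-conditioned quadratic, where the two coordinates stand in for loss terms with mismatched curvature, as the rate and distortion terms have. Plain gradient descent must keep its step size below the stability limit set by the stiffest coordinate and therefore crawls along the flat one, while dividing each gradient component by its local curvature (a diagonal Newton step) lands at the optimum in a single update.

```python
# Toy illustration (not the paper's SOAP implementation): minimize
# f(x, y) = c0*x^2 + c1*y^2 with c0 >> c1, a stand-in for loss terms
# with mismatched curvature. Gradient: (2*c0*x, 2*c1*y); Hessian
# diagonal: (2*c0, 2*c1).

def grad(p, curv):
    return [2 * c * x for c, x in zip(curv, p)]

def gd(p, curv, lr, steps):
    # Plain gradient descent: one shared step size for all coordinates.
    for _ in range(steps):
        p = [x - lr * g for x, g in zip(p, grad(p, curv))]
    return p

def newton(p, curv, steps):
    # Diagonal Newton preconditioning: rescale each gradient component
    # by the inverse curvature of its own coordinate.
    for _ in range(steps):
        p = [x - g / (2 * c) for x, g, c in zip(p, grad(p, curv), curv)]
    return p

start, curv = [1.0, 1.0], [100.0, 1.0]
# GD must keep lr < 1/100 to stay stable on the stiff coordinate,
# so after 100 steps the flat coordinate has barely moved.
p_gd = gd(start, curv, lr=0.009, steps=100)
# The preconditioned step solves the quadratic exactly in one update.
p_nt = newton(start, curv, steps=1)
```

The same effect is what a preconditioned optimizer buys in the R-D setting: no single learning rate has to compromise between directions whose curvatures disagree.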

Yichi Zhang, Fengqing Zhu • 2026

Related benchmarks

Task              | Dataset   | Result        | Rank
Image Compression | Kodak     | --            | 50
Image Compression | Tecnick   | --            | 36
Image Compression | CLIC 2022 | BD-Rate -3.22 | 6
