
Learned Image Compression with Dictionary-based Entropy Model

About

Learned image compression methods have attracted great research interest and now exhibit rate-distortion performance superior to the best classical image compression standards. The entropy model plays a key role in learned image compression: it estimates the probability distribution of the latent representation for subsequent entropy coding. Most existing methods employ hyper-prior and auto-regressive architectures to form their entropy models. However, these architectures only explore the internal dependencies of the latent representation and neglect the value of extracting priors from the training data. In this work, we propose a novel entropy model, the Dictionary-based Cross Attention Entropy model, which introduces a learnable dictionary that summarizes the typical structures occurring in the training dataset to enhance the entropy model. Extensive experiments demonstrate that the proposed model strikes a better balance between performance and latency, achieving state-of-the-art results on various benchmark datasets.
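The core idea, latent features querying a learnable dictionary via cross attention to gather dataset-level priors, can be illustrated with a minimal numpy sketch. This is not the authors' implementation: the dictionary size, channel width, and the projection matrices `Wq`, `Wk`, `Wv` are illustrative placeholders, and in training the dictionary and projections would be learned parameters.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def dictionary_cross_attention(latent, dictionary, Wq, Wk, Wv):
    """Latent tokens act as queries; the learned dictionary supplies
    keys/values, so each token aggregates typical structures seen in
    the training data (a sketch of the paper's idea, not its code)."""
    Q = latent @ Wq                                  # (N, d) queries
    K = dictionary @ Wk                              # (M, d) keys
    V = dictionary @ Wv                              # (M, d) values
    attn = softmax(Q @ K.T / np.sqrt(Q.shape[-1]))   # (N, M) weights
    return attn @ V                                  # (N, d) prior context

rng = np.random.default_rng(0)
N, M, d = 16, 32, 8                 # tokens, dictionary entries, channels (hypothetical sizes)
latent = rng.standard_normal((N, d))
dictionary = rng.standard_normal((M, d))            # learnable in practice
Wq, Wk, Wv = (rng.standard_normal((d, d)) for _ in range(3))
ctx = dictionary_cross_attention(latent, dictionary, Wq, Wk, Wv)
print(ctx.shape)  # (16, 8)
```

In a full entropy model, this dictionary-derived context would be combined with hyper-prior features to predict the mean and scale of each latent element's distribution before entropy coding.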

Jingbo Lu, Leheng Zhang, Xingyu Zhou, Mu Li, Wen Li, Shuhang Gu • 2025

Related benchmarks

| Task              | Dataset | Result                | Rank |
|-------------------|---------|-----------------------|------|
| Image Compression | Kodak   | -                     | 50   |
| Image Compression | Tecnick | -                     | 36   |
| Image Compression | CLIC    | BD-Rate (PSNR): -19.7 | 16   |

Other info

Code
