Haiku to Opus in Just 10 Bits: LLMs Unlock Massive Compression Gains

About

We study compression of LLM-generated text in both lossless and lossy regimes, characterizing a compression-compute frontier along which more compression is possible at the cost of more compute. For lossless compression, domain-adapted LoRA adapters can improve LLM-based arithmetic coding by 2x over compression with the base LLM alone. For lossy compression, prompting a model for a succinct rewrite and then applying arithmetic coding can achieve compression ratios of approximately 0.03, a 2x improvement over compressing the original response. We further introduce Question-Asking (QA) compression, an interactive lossy protocol inspired by the game 'Twenty Questions': a small model iteratively refines its response by asking yes/no questions of a stronger model, transferring exactly one bit per answer. On 8 benchmarks spanning math, science, and code, 10 binary questions recover 23% to 72% of the capability gap between a small and a large model on standard benchmarks and 7% to 38% on harder benchmarks, at compression ratios of 0.0006 to 0.004. This is over 100x smaller than prior LLM-based compression (Delétang et al., 2024), suggesting that interactive protocols can transfer knowledge far more efficiently than transmitting full responses.
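To make the one-bit-per-answer accounting concrete, here is a minimal, self-contained sketch of the QA protocol's core loop. In the paper both sides are LLMs exchanging free-form yes/no questions; the toy below replaces them with a fixed candidate list and a membership oracle, so every name here (`teacher_oracle`, `qa_compress`, the candidate strings) is illustrative rather than the authors' implementation.

```python
def teacher_oracle(candidate_subset, true_answer):
    """Stand-in for the strong model: answers one yes/no question,
    i.e. transmits exactly one bit of information per call."""
    return true_answer in candidate_subset

def qa_compress(candidates, true_answer, budget=10):
    """Stand-in for the small model: halves its candidate set with
    each answer, so `budget` answers distinguish 2**budget candidates."""
    bits_used = 0
    while len(candidates) > 1 and bits_used < budget:
        half = candidates[: len(candidates) // 2]
        if teacher_oracle(set(half), true_answer):
            candidates = half                    # "yes" -> keep the first half
        else:
            candidates = candidates[len(half):]  # "no"  -> keep the rest
        bits_used += 1
    return candidates[0], bits_used

# Toy run: 10 answers pinpoint one response among 2**10 = 1024 candidates.
candidates = [f"response_{i}" for i in range(1024)]
recovered, bits = qa_compress(candidates, true_answer="response_777")
print(recovered, bits)  # -> response_777 10
```

The binary search is the idealized case; the paper's point is that even free-form yes/no questions from a weaker model approach this one-bit-per-answer budget while recovering a substantial fraction of the capability gap.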

Roy Rinberg, Annabelle Michael Carrell, Simon Henniger, Nicholas Carlini, Keri Warr • 2026

Related benchmarks

Task              Dataset                                      Compression Ratio  Rank
Text Compression  LMSYS-Chat Overall                           0.09               6
Text Compression  LMSYS-Chat Cluster 0: General Chat           0.11               6
Text Compression  LMSYS-Chat Cluster 1: Creative Writing       0.11               6
Text Compression  LMSYS-Chat Cluster 2: Code/Technical         0.10               6
Text Compression  LMSYS-Chat Cluster 3: Academic/Education     0.09               6
Text Compression  LMSYS-Chat Cluster 4: Roleplay Fiction       0.08               6
Text Compression  LMSYS-Chat Cluster 5: Business/Professional  0.09               6
Text Compression  LMSYS-Chat Cluster 6: Philosophy/Ethics      0.09               6
Text Compression  LMSYS-Chat Cluster 8: Translation Language   0.08               6
Text Compression  LMSYS-Chat Cluster 9: Casual Q&A             0.10               6
Showing 10 of 27 rows
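As a sanity check on the metric: compression ratio here is compressed size divided by original size, so smaller is better. The arithmetic below uses illustrative sizes (not measurements from the paper) to show what the table's ratios and the QA ratios from the abstract imply.

```python
# Compression ratio = compressed size / original size (smaller is better).
# The sizes below are hypothetical, not measurements from the paper.

original_bytes = 10_000       # a hypothetical ~10 KB model response
print(0.09 * original_bytes)  # table's overall ratio 0.09 -> ~900 bytes

qa_bits = 10                  # the QA protocol transmits 10 one-bit answers
print(qa_bits / 0.0006)       # ratio 0.0006 -> original of ~16,667 bits (~2 KB)
```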
