
Byte Latent Transformer: Patches Scale Better Than Tokens

About

We introduce the Byte Latent Transformer (BLT), a new byte-level LLM architecture that, for the first time, matches tokenization-based LLM performance at scale with significant improvements in inference efficiency and robustness. BLT encodes bytes into dynamically sized patches, which serve as the primary units of computation. Patches are segmented based on the entropy of the next byte, allocating more compute and model capacity where increased data complexity demands it. We present the first FLOP-controlled scaling study of byte-level models up to 8B parameters and 4T training bytes. Our results demonstrate the feasibility of scaling models trained on raw bytes without a fixed vocabulary. Both training and inference efficiency improve due to dynamically selecting long patches when data is predictable, along with qualitative improvements in reasoning and long-tail generalization. Overall, for fixed inference costs, BLT shows significantly better scaling than tokenization-based models by simultaneously growing both patch and model size.
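The entropy-based patching described above can be sketched in a few lines. Note the hedge: BLT segments patches using a small trained byte-level language model to score next-byte entropy; the unigram frequency estimator and the threshold value below are illustrative stand-ins, not the paper's method.

```python
import math
from collections import Counter

def next_byte_entropies(data: bytes) -> list[float]:
    # Toy stand-in for BLT's small byte-level LM: score each byte's surprisal
    # (-log2 p) under unigram frequencies of the bytes seen so far.
    counts = Counter()
    entropies = []
    for b in data:
        total = sum(counts.values())
        if total == 0:
            entropies.append(8.0)  # maximum uncertainty before any evidence
        else:
            # simple add-one-style smoothing for unseen bytes (256 possible values)
            p = counts[b] / total if counts[b] else 1 / (total + 256)
            entropies.append(-math.log2(p))
        counts[b] += 1
    return entropies

def segment_into_patches(data: bytes, threshold: float = 4.0) -> list[bytes]:
    # Start a new patch whenever next-byte entropy exceeds the threshold:
    # unpredictable regions get short patches (more compute per byte),
    # predictable runs collapse into long patches (cheaper to process).
    entropies = next_byte_entropies(data)
    patches, start = [], 0
    for i in range(1, len(data)):
        if entropies[i] > threshold:
            patches.append(data[start:i])
            start = i
    patches.append(data[start:])
    return patches
```

On a predictable run followed by a novel byte, the segmenter cuts exactly at the surprise: `segment_into_patches(b"aaaaaaaaaabbbbbbbbbb")` yields one patch per run.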

Artidoro Pagnoni, Ram Pasunuru, Pedro Rodriguez, John Nguyen, Benjamin Muller, Margaret Li, Chunting Zhou, Lili Yu, Jason Weston, Luke Zettlemoyer, Gargi Ghosh, Mike Lewis, Ari Holtzman, Srinivasan Iyer • 2024

Related benchmarks

Task | Dataset | Result | Rank
Generative Question Answering | Bolmo Evaluation Suite GenQA 7B | GenQA Average: 0.684 | 29
Multiple-choice Question Answering | Bolmo Evaluation Suite MC STEM 7B | MC STEM Average Accuracy: 49 | 17
Language Modeling Evaluation | Bolmo 1B evaluation suite | Overall Average Score: 58.5 | 5
Character Understanding | Bolmo Character Understanding 7B | Char (Avg): 49.3 | 5
Code Generation | Bolmo Evaluation Suite Code 7B | Average Code Score: 0.316 | 5
Mathematical Reasoning | Bolmo Evaluation Suite Math 7B | Avg Math Score: 15.7 | 5
Multiple-choice Question Answering | Bolmo 7B Evaluation Suite MC Non-STEM | Average Score (Non-STEM): 56.6 | 5
