
Which Pieces Does Unigram Tokenization Really Need?

About

The Unigram tokenization algorithm offers a probabilistic alternative to the greedy heuristics of Byte-Pair Encoding. Despite its theoretical elegance, its implementation in practice is complex, limiting its adoption to the SentencePiece package and adapters thereof. We bridge this gap between theory and practice by providing a clear guide to implementation and parameter choices. We also identify a simpler algorithm that accepts slightly higher training loss in exchange for improved compression.
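At inference time, a Unigram model segments text by choosing the piece sequence with the highest total log-probability, typically via Viterbi dynamic programming. A minimal sketch, using a toy hand-picked vocabulary (the `LOGP` values are illustrative, not trained; a real model learns them with EM):

```python
import math

# Toy vocabulary of pieces with hypothetical log-probabilities.
# In a real Unigram model these come from EM training over a corpus.
LOGP = {
    "h": math.log(0.05),
    "e": math.log(0.05),
    "l": math.log(0.05),
    "o": math.log(0.05),
    "he": math.log(0.10),
    "ll": math.log(0.08),
    "hell": math.log(0.15),
    "hello": math.log(0.20),
}

def viterbi_segment(text: str) -> list[str]:
    """Return the piece sequence maximizing the sum of piece log-probs."""
    n = len(text)
    best = [float("-inf")] * (n + 1)  # best[i]: best score for text[:i]
    best[0] = 0.0
    back = [0] * (n + 1)              # back[i]: start index of the last piece
    for i in range(1, n + 1):
        for j in range(i):
            piece = text[j:i]
            if piece in LOGP and best[j] + LOGP[piece] > best[i]:
                best[i] = best[j] + LOGP[piece]
                back[i] = j
    # Recover the segmentation by walking the backpointers.
    pieces, i = [], n
    while i > 0:
        pieces.append(text[back[i]:i])
        i = back[i]
    return pieces[::-1]

print(viterbi_segment("hello"))  # → ['hello']
```

Because the single piece "hello" has a higher log-probability than any multi-piece split, the whole word is kept as one token; this probabilistic global search is what distinguishes Unigram from BPE's greedy merge order.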

Sander Land, Yuval Pinter • 2025

Related benchmarks

Task                      Dataset                       Result              Rank
Compression               300 MB Monolingual Corpora    Train Tokens: 58.2  9
Morphological Alignment   English 300 MB Corpora        Morph. Score: 55.9  9
