
Value-Aware Numerical Representations for Transformer Language Models

About

Transformer-based language models often achieve strong results on mathematical reasoning benchmarks while remaining fragile on basic numerical understanding and arithmetic operations. A central limitation is that numbers are processed as symbolic tokens whose embeddings do not explicitly encode numerical value, leading to systematic errors. We introduce a value-aware numerical representation that augments standard tokenized inputs with a dedicated prefix token whose embedding is explicitly conditioned on the underlying numerical value. This mechanism injects magnitude information directly into the model's input space while remaining compatible with existing tokenizers and decoder-only Transformer architectures. Evaluation on arithmetic tasks shows that the proposed approach outperforms baselines across numerical formats, tasks, and operand lengths. These results indicate that explicitly encoding numerical value is an effective and efficient way to improve fundamental numerical robustness in language models.
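To make the mechanism concrete, below is a minimal sketch of how a value-conditioned prefix embedding could be injected into a decoder-only model's input sequence. Everything here is an assumption for illustration: the module and function names (ValueAwarePrefix, embed_with_value_prefix), the sign/log-magnitude featurization, and the MLP are not taken from the paper, which only states that the prefix token's embedding is conditioned on the underlying numerical value.

```python
import torch
import torch.nn as nn

class ValueAwarePrefix(nn.Module):
    """Hypothetical sketch: maps a scalar numeric value to a prefix
    embedding inserted before the number's ordinary tokens."""

    def __init__(self, d_model: int, hidden: int = 128):
        super().__init__()
        # Assumption: a small MLP over a simple value featurization.
        # The paper does not specify the exact network or features.
        self.mlp = nn.Sequential(
            nn.Linear(2, hidden),
            nn.GELU(),
            nn.Linear(hidden, d_model),
        )

    def forward(self, value: torch.Tensor) -> torch.Tensor:
        # Encode sign and log-magnitude so very large and very small
        # numbers stay in a well-scaled input range.
        sign = torch.sign(value)
        log_mag = torch.log1p(value.abs())
        feats = torch.stack([sign, log_mag], dim=-1)
        return self.mlp(feats)  # (..., d_model)

def embed_with_value_prefix(token_emb, number_starts, values, prefix_module):
    """Insert one value-conditioned prefix embedding before each number.

    token_emb:     (seq, d_model) ordinary token embeddings
    number_starts: sorted start indices of number spans in the sequence
    values:        (n_numbers,) the parsed numeric values
    """
    prefixes = prefix_module(values)  # (n_numbers, d_model)
    pieces, cursor = [], 0
    for start, prefix in zip(number_starts, prefixes):
        pieces.append(token_emb[cursor:start])
        pieces.append(prefix.unsqueeze(0))  # inject magnitude information
        cursor = start
    pieces.append(token_emb[cursor:])
    return torch.cat(pieces, dim=0)  # (seq + n_numbers, d_model)
```

The log-magnitude featurization is one plausible choice for keeping inputs well scaled across operand lengths; because the prefix is added purely at the embedding level, the surrounding tokenizer and Transformer stack are left unchanged, consistent with the compatibility claim in the abstract.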

Andreea Dutulescu, Stefan Ruseti, Mihai Dascalu • 2026

Related benchmarks

Task                 | Dataset           | Metric      | Result | Rank
Numerical Generation | NUPA              | Exact Match | 83.3   | 28
Numerical Reasoning  | NUPA (aggregated) | Exact Match | 72.4   | 4
