
Sequence Repetition Enhances Token Embeddings and Improves Sequence Labeling with Decoder-only Language Models

About

Modern language models (LMs) are trained in an autoregressive manner, conditioned only on the prefix. In contrast, sequence labeling (SL) tasks assign labels to each individual input token, naturally benefiting from bidirectional context. This discrepancy has historically led SL to rely on inherently bidirectional encoder-only models. However, the rapid development of decoder-only models has raised the question of whether they can be adapted to SL. While causal mask removal has emerged as a viable technique for adapting decoder-only models to leverage the full context for SL, it requires considerable changes to the base model functionality. In this work, we explore sequence repetition (SR) as a less invasive alternative for enabling bidirectionality in decoder-only models. Through fine-tuning experiments, we show that SR inherently makes decoders bidirectional, improving the quality of token-level embeddings and surpassing encoders and unmasked decoders. Contrary to earlier claims, we find that increasing the number of repetitions does not degrade SL performance. Finally, we demonstrate that embeddings from intermediate layers are highly effective for SR, comparable to those from final layers, while being significantly more efficient to compute. Our findings underscore that SR alleviates the structural limitations of decoders, enabling more efficient and adaptable LMs and broadening their applicability to other token-level tasks.
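The core idea behind SR, as described above, is to feed the model the input sequence more than once, so that tokens in a later copy can attend, through the ordinary causal mask, to every token of the earlier copies, yielding full-context token embeddings without altering the base model. The sketch below illustrates this idea on a generic decoder-only model; the choice of backbone (GPT-2), the EOS separator between copies, the two-repetition setting, and the intermediate-layer index are illustrative assumptions, not the paper's exact configuration.

```python
# Minimal sketch of sequence repetition (SR) for token-level embeddings.
# Assumptions (not taken from the paper): GPT-2 as the decoder-only backbone,
# an EOS token as separator between copies, two repetitions, and an arbitrary
# intermediate layer index chosen for illustration.
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModel.from_pretrained("gpt2")

def repeated_token_embeddings(text, n_repeats=2, layer=-1):
    """Return one embedding per token of `text`, taken from the last copy of
    the repeated sequence, so each token has attended to the full sequence
    via the causal attention over the earlier copies."""
    ids = tokenizer(text, return_tensors="pt")["input_ids"][0]
    sep = torch.tensor([tokenizer.eos_token_id])
    # Concatenate n_repeats copies of the sequence, separated by EOS.
    pieces = []
    for i in range(n_repeats):
        pieces.append(ids)
        if i < n_repeats - 1:
            pieces.append(sep)
    input_ids = torch.cat(pieces).unsqueeze(0)
    with torch.no_grad():
        out = model(input_ids, output_hidden_states=True)
    hidden = out.hidden_states[layer][0]   # (total_seq_len, hidden_dim)
    # Token positions of the last copy: the final len(ids) positions.
    return hidden[-len(ids):]              # one vector per original token

# Usage: embeddings for a sequence-labeling example, from an intermediate layer.
emb = repeated_token_embeddings("Barack Obama visited Zagreb", layer=6)
print(emb.shape)  # (num_tokens, hidden_size)
```

In an SL setup, these per-token vectors would feed a token classification head; fine-tuning then proceeds as with an encoder, except that the model internals are left untouched.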

Matija Luka Kukić, Marko Čuljak, David Dukić, Martin Tutek, Jan Šnajder • 2026

Related benchmarks

Task              | Dataset           | Metric   | Result | Rank
Sequence Labeling | NLU++ (test)      | Micro F1 | 81.29  | 23
Sequence Labeling | ACE05 (test)      | Micro F1 | 77.79  | 23
Sequence Tagging  | CoNLL 2003 (test) | Micro F1 | 93.79  | 23
Sequence Labeling | Rest14 (test)     | Micro F1 | 83.51  | 23
