
Diffutron: A Masked Diffusion Language Model for Turkish Language

About

Masked Diffusion Language Models (MDLMs) have emerged as a compelling non-autoregressive alternative to standard large language models; however, their application to morphologically rich languages remains limited. In this paper, we introduce *Diffutron*, a masked diffusion language model specifically designed for Turkish. Our approach leverages a resource-efficient training pipeline, starting with LoRA-based continual pre-training of a multilingual encoder on a large-scale corpus. To enable generative capabilities, we employ a progressive instruction-tuning strategy, sequentially adapting the model on general and task-specific instruction sets. Experimental results across comprehensive benchmarks demonstrate that, despite its compact size, our model achieves competitive performance compared to existing multi-billion-parameter baselines. These findings validate the effectiveness of masked diffusion modeling combined with multi-stage tuning for non-autoregressive text generation in Turkish.
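To illustrate the non-autoregressive generation the abstract describes, here is a minimal sketch of iterative unmasking, the decoding loop typically used by masked diffusion language models: start from a fully masked sequence and, at each step, commit the highest-confidence predictions while re-predicting the rest. The `toy_predictor`, the toy vocabulary, and the unmasking schedule are all illustrative assumptions, not Diffutron's actual components.

```python
import random

MASK = "[MASK]"
# Toy vocabulary (illustrative only; a real model uses its tokenizer's vocab)
VOCAB = ["bir", "dil", "modeli", "Türkçe", "için", "."]

def toy_predictor(tokens):
    """Stand-in for the MDLM denoiser: for each masked position, return a
    (token, confidence) guess. A real model would run a masked-LM forward
    pass and take softmax probabilities as confidences."""
    return {i: (random.choice(VOCAB), random.random())
            for i, t in enumerate(tokens) if t == MASK}

def mdlm_generate(length=8, steps=4, seed=0):
    """Iterative unmasking: begin fully masked, and at each of `steps`
    denoising steps fix the most confident predictions in place."""
    random.seed(seed)
    tokens = [MASK] * length
    for step in range(steps):
        preds = toy_predictor(tokens)
        if not preds:
            break
        # Unmask roughly an equal share of the remaining masked positions
        # at every step (a simple linear schedule).
        k = max(1, len(preds) // (steps - step))
        best = sorted(preds.items(), key=lambda kv: -kv[1][1])[:k]
        for i, (tok, _) in best:
            tokens[i] = tok
    return tokens

print(mdlm_generate())
```

Because every masked position is predicted in parallel at each step, the number of forward passes is fixed by `steps` rather than by sequence length, which is the main appeal of this family of models over left-to-right decoding.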

Şuayp Talha Kocabay, Talha Rüzgar Akkuş • 2026

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Machine Reading Comprehension | BELEBELE Target Language | MRC Score | 27 | 24 |
| Semantic Textual Similarity | STSb-TR | STSb Score | 18.78 | 8 |
| Irony Detection | IronyTR | Score | 52 | 8 |
| Cross-lingual Question Answering | EXAMS TR | Score | 27.74 | 8 |
| News Category Classification | News Category Classification | Score | 32.4 | 8 |
| Natural Language Inference | MNLI TR | Score | 33.29 | 8 |
| Causal Reasoning | XCOPA | XCOPA Causal Reasoning Score | 53.8 | 8 |
