
SpanNorm: Reconciling Training Stability and Performance in Deep Transformers

About

The success of Large Language Models (LLMs) hinges on the stable training of deep Transformer architectures. A critical design choice is the placement of normalization layers, which leads to a fundamental trade-off: the "PreNorm" architecture ensures training stability at the cost of potential performance degradation in deep models, while the "PostNorm" architecture offers strong performance but suffers from severe training instability. In this work, we propose SpanNorm, a novel technique designed to resolve this dilemma by integrating the strengths of both paradigms. Structurally, SpanNorm establishes a clean residual connection that spans the entire transformer block to stabilize signal propagation, while employing a PostNorm-style computation that normalizes the aggregated output to enhance model performance. We provide a theoretical analysis demonstrating that SpanNorm, combined with a principled scaling strategy, maintains bounded signal variance throughout the network, preventing the gradient issues that plague PostNorm models while also alleviating the representation collapse of PreNorm. Empirically, SpanNorm consistently outperforms standard normalization schemes in both dense and Mixture-of-Experts (MoE) settings, paving the way for more powerful and stable Transformer architectures.

Chao Wang, Bei Li, Jiaqi Zhang, Xinyu Liu, Yuchun Fan, Linkun Lyu, Xin Chen, Jingang Wang, Tong Xiao, Peng Pei, Xunliang Cai • 2026
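The abstract's contrast between the three normalization placements can be sketched in a few lines. Below is a minimal, parameter-free illustration (plain Python lists standing in for hidden-state vectors, LayerNorm without learned scale/shift). The PreNorm and PostNorm wirings follow their standard definitions; the SpanNorm function encodes only what the abstract states, namely a single residual connection spanning the whole block with the aggregated output normalized PostNorm-style. The block's internal wiring and the paper's scaling strategy are not specified here, so treat this as an assumption-laden sketch, not the authors' implementation.

```python
import math

def layer_norm(x, eps=1e-5):
    """Normalize a vector to zero mean and (near-)unit variance."""
    mu = sum(x) / len(x)
    var = sum((v - mu) ** 2 for v in x) / len(x)
    return [(v - mu) / math.sqrt(var + eps) for v in x]

def add(a, b):
    """Elementwise residual addition."""
    return [u + v for u, v in zip(a, b)]

def pre_norm_block(x, attn, ffn):
    # PreNorm: each sublayer sees a normalized input, but the residual
    # stream itself is never normalized, so its variance grows with depth.
    h = add(x, attn(layer_norm(x)))
    return add(h, ffn(layer_norm(h)))

def post_norm_block(x, attn, ffn):
    # PostNorm: normalize after every residual addition; strong
    # performance, but no clean identity path from input to output.
    h = layer_norm(add(x, attn(x)))
    return layer_norm(add(h, ffn(h)))

def span_norm_block(x, block):
    # SpanNorm, per the abstract: one residual connection spans the entire
    # transformer block (`block` = attention + FFN, internals left
    # abstract), and the aggregated output is normalized PostNorm-style.
    return layer_norm(add(x, block(x)))
```

Note the structural point the abstract makes: in `span_norm_block` the input `x` reaches the output through a single clean skip connection (as in PreNorm), yet the final representation is still normalized (as in PostNorm), which is the combination the paper argues keeps signal variance bounded.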

Related benchmarks

Task                             Dataset                Result                     Rank
Commonsense Reasoning            HellaSwag              Accuracy 68.1              1460
Commonsense Reasoning            WinoGrande             Accuracy 64.2              776
Commonsense Reasoning            PIQA                   Accuracy 76.2              647
Language Modeling                WikiText               PPL 11.4                   479
Question Answering               SciQ                   Accuracy 92                226
Language Modeling                LAMBADA                Accuracy 64.2              183
Question Answering               ARC Challenge          Normalized Accuracy 38.6   17
Language Model Evaluation Suite  LM Evaluation Harness  Avg Accuracy 66.6          8
