
Native Hybrid Attention for Efficient Sequence Modeling

About

Transformers excel at sequence modeling but incur quadratic complexity, while linear attention offers improved efficiency but often compromises recall accuracy over long contexts. In this work, we introduce Native Hybrid Attention (NHA), a novel hybrid architecture of linear and full attention that integrates both intra- and inter-layer hybridization into a unified layer design. NHA maintains long-term context in key-value slots updated by a linear RNN and augments them with short-term tokens from a sliding window. A single softmax attention operation is then applied over all keys and values, enabling per-token and per-head context-dependent weighting without additional fusion parameters. Inter-layer behavior is controlled through a single hyperparameter, the sliding window size, which allows smooth adjustment between purely linear and full attention while keeping all layers structurally uniform. Experimental results show that NHA surpasses Transformers and other hybrid baselines on recall-intensive and commonsense reasoning tasks. Furthermore, pretrained LLMs can be structurally hybridized with NHA, achieving competitive accuracy while delivering significant efficiency gains. Code is available at https://github.com/JusenD/NHA.
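To make the mechanism concrete, the sketch below shows how a single NHA attention step might look in PyTorch. This is a minimal, single-head illustration based only on the abstract: the function names, tensor shapes, slot count `m`, and the decay-style slot update rule are our assumptions, not the paper's exact formulation (see https://github.com/JusenD/NHA for the official implementation).

```python
import torch
import torch.nn.functional as F

def nha_attention_step(q, window_k, window_v, slot_k, slot_v):
    """One NHA decoding step (hypothetical single-head sketch).

    q        : (d,)    query for the current token
    window_k : (w, d)  keys of the last w tokens (short-term sliding window)
    window_v : (w, d)  values of the last w tokens
    slot_k   : (m, d)  m long-term key slots, maintained by a linear RNN
    slot_v   : (m, d)  m long-term value slots
    """
    d = q.shape[-1]
    # A single softmax over long-term slots and short-term window tokens:
    # the attention weights decide, per token (and per head in the full
    # model), how much to rely on each context type, with no extra
    # fusion parameters.
    k = torch.cat([slot_k, window_k], dim=0)      # (m + w, d)
    v = torch.cat([slot_v, window_v], dim=0)      # (m + w, d)
    attn = F.softmax(k @ q / d**0.5, dim=-1)      # (m + w,)
    return attn @ v                               # (d,)

def update_slots(slot_k, slot_v, k_out, v_out, decay=0.95):
    """Hypothetical linear-RNN slot update: decay the slots and absorb
    the key/value of the token leaving the sliding window. The paper's
    actual recurrence may differ."""
    slot_k = decay * slot_k + (1 - decay) * k_out
    slot_v = decay * slot_v + (1 - decay) * v_out
    return slot_k, slot_v
```

Under this reading, the sliding window size w is the single hyperparameter the abstract refers to: with w = 0, attention reads only the RNN-updated slots (purely linear-style behavior), while growing w toward the full sequence length recovers ordinary softmax attention, with every layer keeping the same structure.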

Jusen Du, Jiaxi Hu, Tao Zhang, Weigao Sun, Yu Cheng • 2025

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Commonsense Reasoning | WinoGrande | Accuracy | 73.8 | 1085 |
| Instruction Following | IFEval | IFEval Accuracy | 30.94 | 625 |
| Question Answering | ARC Easy | -- | -- | 597 |
| Mathematical Reasoning | MathQA | -- | -- | 305 |
| Sentence Completion | HellaSwag | Accuracy | 79.1 | 276 |
| Word Prediction | LAMBADA | Accuracy | 68.85 | 148 |
| Question Answering | ARC Challenge | Accuracy (ARC) | 56.4 | 142 |
| Question Answering | PubMedQA (test) | -- | -- | 128 |
| Physical Reasoning | PIQA | Accuracy | 80.96 | 74 |
| Recall-intensive Retrieval | SWDE, SQuAD, FDA, TriviaQA, NQ, DROP | Performance on SWDE | 67.67 | 24 |

Showing 10 of 16 benchmark rows.
