Unveiling Simplicities of Attention: Adaptive Long-Context Head Identification

About

The ability to process long contexts is crucial for many natural language processing tasks, yet it remains a significant challenge. While substantial progress has been made in enhancing the efficiency of attention mechanisms, there is still a gap in understanding how attention heads function in long-context settings. In this paper, we observe that while certain heads consistently attend to local information only, others swing between attending to local and long-context information depending on the query. This raises the question: can we identify which heads require long-context information to predict the next token accurately? We demonstrate that it's possible to predict which heads are crucial for long-context processing using only local keys. The core idea here is to exploit a simple model for the long-context scores via second moment approximations. These findings unveil simple properties of attention in the context of long sequences, and open the door to potentially significant gains in efficiency.
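The abstract's core idea, predicting from local keys alone whether a head will need long-context information, can be illustrated with a toy heuristic. The sketch below is a hypothetical construction, not the paper's actual method: it estimates the mean and standard deviation of a head's query-key logits from the local window (a second-moment approximation) and asks whether the expected maximum over many distant keys could plausibly exceed the best local logit.

```python
import numpy as np

def needs_long_context(q, local_keys, n_long):
    """Heuristic flag: could distant keys out-score the local ones?

    Models long-context logits as Gaussian with mean/std estimated
    from the local keys (a second-moment approximation), and compares
    the expected maximum of n_long such logits to the best local logit.
    This is an illustrative sketch, not the paper's algorithm.
    """
    local_logits = local_keys @ q          # (n_local,) attention logits
    mu = local_logits.mean()
    sigma = local_logits.std()
    # Expected max of n_long i.i.d. N(mu, sigma^2) draws ~ mu + sigma*sqrt(2 ln n_long)
    est_long_max = mu + sigma * np.sqrt(2.0 * np.log(n_long))
    return bool(est_long_max > local_logits.max())

# Toy usage: a head whose local logits all agree never needs the long context
# under this model, while one with high local-logit variance might.
q = np.ones(2)
uniform_keys = np.tile(np.ones(2), (8, 1))        # every logit equals 2
varied_keys = np.array([[0.0, 0.0], [1.0, 1.0]])  # logits 0 and 2
print(needs_long_context(q, uniform_keys, n_long=4096))
print(needs_long_context(q, varied_keys, n_long=4096))
```

Zero local variance makes the predicted long-context maximum collapse onto the local mean, so the head is classified as local-only; spread-out local logits inflate the estimate and flag the head as potentially long-context.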

Konstantin Donhauser, Charles Arnal, Mohammad Pezeshki, Vivien Cabannes, David Lopez-Paz, Kartik Ahuja• 2025

Related benchmarks

| Task | Dataset | Result | Rank |
| --- | --- | --- | --- |
| Long-context Language Understanding | LongBench | Average Score: 43.9 | 86 |
| Information Retrieval | NIAH (test) | Average Score: 95.2 | 59 |
