Transformers are Multi-State RNNs

About

Transformers are considered conceptually different from the previous generation of state-of-the-art NLP models: recurrent neural networks (RNNs). In this work, we demonstrate that decoder-only transformers can in fact be conceptualized as unbounded multi-state RNNs, an RNN variant with unlimited hidden state size. We further show that transformers can be converted into bounded multi-state RNNs by fixing the size of their hidden state, effectively compressing their key-value cache. We introduce a novel, training-free compression policy: Token Omission Via Attention (TOVA). Our experiments with four long-range tasks and several LLMs show that TOVA outperforms several baseline compression policies. In particular, our results are nearly on par with the full model while using in some cases only 1/8 of the original cache size, which translates to 4.8x higher throughput. Our results shed light on the connection between transformers and RNNs, and help mitigate one of LLMs' most painful computational bottlenecks: the size of their key-value cache. We publicly release our code at https://github.com/schwartz-lab-NLP/TOVA

Matanel Oren, Michael Hassid, Nir Yarden, Yossi Adi, Roy Schwartz • 2024
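
To make the bounded multi-state view concrete, here is a minimal, hypothetical Python sketch of the TOVA eviction policy described in the abstract. This is not the authors' implementation (see the linked repository for the official code): it handles a single attention head, and the names `TOVACache`, `cache_size`, and `step` are invented for this example. The paper also averages attention weights across heads, which this single-head sketch omits.

```python
# Minimal sketch of a TOVA-style bounded KV cache (assumptions noted above;
# official code: https://github.com/schwartz-lab-NLP/TOVA).
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

class TOVACache:
    """A KV cache capped at `cache_size` entries (a bounded multi-state)."""

    def __init__(self, cache_size, d_head):
        self.cache_size = cache_size
        self.keys = np.empty((0, d_head))
        self.values = np.empty((0, d_head))

    def step(self, q, k, v):
        """Append (k, v), attend with query q, evict one entry if over budget."""
        self.keys = np.vstack([self.keys, k])
        self.values = np.vstack([self.values, v])

        # Standard scaled dot-product attention of the current query
        # over everything still held in the cache.
        scores = self.keys @ q / np.sqrt(q.shape[-1])
        attn = softmax(scores)
        out = attn @ self.values

        # TOVA: once the bounded state is full, drop the cached token that
        # receives the lowest attention weight from the current query
        # (single-head variant; the paper averages across heads).
        if len(self.keys) > self.cache_size:
            drop = int(np.argmin(attn))
            self.keys = np.delete(self.keys, drop, axis=0)
            self.values = np.delete(self.values, drop, axis=0)
        return out
```

The design point worth noting, per the abstract, is that the policy is training-free: eviction relies only on attention weights the model already computes during decoding, so the cache stays at a fixed size regardless of sequence length.
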

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Language Modeling | WikiText-2 | Perplexity (PPL) | 27 | 1624 |
| Mathematical Reasoning | GSM8K | Accuracy | 60.3 | 1362 |
| Multi-task Language Understanding | MMLU | Accuracy | 49 | 321 |
| Mathematical Reasoning | AIME 2024 (test) | Accuracy | 36.7 | 159 |
| Document Question Answering | Qasper | Accuracy | 38.6 | 44 |
| Needle-in-a-Haystack | Needle-in-a-haystack, 4x original context | Accuracy | 100 | 35 |
| Variable Tracking | RULER-VT | Accuracy | 99.7 | 33 |
| Key-Value Retrieval | LITM (Lost in the Middle) | Accuracy | 8.7 | 33 |
| Long-context Language Understanding | LongBench v1 (test) | 2WQA Score | 41.33 | 14 |
| Long Text Tasks | LongBench 4k-length (test) | PR (Zh) | 5.5 | 13 |
(Showing 10 of 12 benchmark results.)
