
Focus Your Attention (with Adaptive IIR Filters)

About

We present a new layer in which dynamic (i.e., input-dependent) Infinite Impulse Response (IIR) filters of order two are used to process the input sequence prior to applying conventional attention. The input is split into chunks, and the coefficients of these filters are determined based on previous chunks to maintain causality. Despite their relatively low order, the causal adaptive filters are shown to focus attention on the relevant sequence elements. The new layer is grounded in control theory and is shown to generalize diagonal state-space layers. The layer performs on par with state-of-the-art networks, with a fraction of their parameters and with time complexity that is sub-quadratic in the input size. The obtained layer compares favorably to layers such as Hyena, GPT-2, and Mega, both with respect to the number of parameters and the level of performance obtained on multiple long-range sequence problems.
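As a rough illustration of the mechanism the abstract describes, the sketch below (not the authors' code; all names such as `coeffs_from_summary` and the chunk length are illustrative assumptions) splits a sequence into chunks, derives order-two IIR coefficients from the previous chunk only, and filters the current chunk with the standard second-order recurrence:

```python
# Minimal sketch of a chunkwise-adaptive order-2 IIR filter.
# Coefficients for each chunk depend only on the PREVIOUS chunk,
# which preserves causality. A real model would replace
# coeffs_from_summary with a learned network.
import numpy as np

CHUNK = 16  # hypothetical chunk length


def coeffs_from_summary(summary):
    """Toy coefficient predictor mapping a scalar chunk summary to
    order-2 IIR coefficients (b0, b1, b2, a1, a2)."""
    s = np.tanh(summary)             # squash into (-1, 1)
    b0, b1, b2 = 1.0, 0.5 * s, 0.25 * s
    a1, a2 = 0.5 * s, 0.1 * s        # small feedback terms keep the filter stable
    return b0, b1, b2, a1, a2


def adaptive_iir(x):
    """Apply, per chunk, the order-2 recurrence
    y[n] = b0*x[n] + b1*x[n-1] + b2*x[n-2] - a1*y[n-1] - a2*y[n-2]."""
    y = np.zeros_like(x)
    coeffs = (1.0, 0.0, 0.0, 0.0, 0.0)  # identity filter for the first chunk
    for start in range(0, len(x), CHUNK):
        if start > 0:  # adapt using only past inputs
            coeffs = coeffs_from_summary(x[start - CHUNK:start].mean())
        b0, b1, b2, a1, a2 = coeffs
        for n in range(start, min(start + CHUNK, len(x))):
            y[n] = (b0 * x[n]
                    + (b1 * x[n - 1] if n >= 1 else 0.0)
                    + (b2 * x[n - 2] if n >= 2 else 0.0)
                    - (a1 * y[n - 1] if n >= 1 else 0.0)
                    - (a2 * y[n - 2] if n >= 2 else 0.0))
    return y


x = np.random.default_rng(0).standard_normal(64)
y = adaptive_iir(x)
print(y.shape)  # (64,)
```

The filtered sequence `y` would then be fed to a conventional attention layer; because each chunk's filter only sees earlier chunks, the overall operator remains causal.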

Shahar Lutati, Itamar Zimerman, Lior Wolf • 2023

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Sequential Image Classification | PMNIST (test) | Accuracy (Test) | 98.8 | 77 |
| Language Model | Enwiki8 | BPC | 0.94 | 23 |
| Image Classification | S-MNIST (test) | Average Accuracy | 99.7 | 18 |
| Associative Recall | Associative Recall L=30 (test) | Accuracy | 100 | 5 |
| Associative Recall | Associative Recall L=1K (test) | Accuracy | 100 | 5 |
| Associative Recall | Associative Recall L=32K (test) | Accuracy | 100 | 4 |
| Associative Recall | Associative Recall L=64K (test) | Accuracy | 100 | 4 |
| Associative Recall | Associative Recall L=8K (test) | Accuracy | 100 | 3 |
| Character-level Language Modeling | text8 | BPC | 0.98 | 16 |
