
Neuronal Attention Circuit (NAC) for Representation Learning

About

Attention improves representation learning over RNNs, but its discrete nature limits continuous-time (CT) modeling. We introduce the Neuronal Attention Circuit (NAC), a biologically inspired CT-attention mechanism that reformulates attention logit computation as the solution of a linear first-order ODE with nonlinear interlinked gates, derived by repurposing the wiring of C. elegans Neuronal Circuit Policies (NCPs). NAC replaces dense projections with sparse sensory gates for key-query projections, and uses a sparse backbone network with two heads to compute content-target and learnable time-constant gates, enabling efficient adaptive dynamics. To reduce runtime and memory costs, we implement an adaptable subquadratic sparse Top-K pairwise concatenation mechanism that selectively curates key-query interactions. We provide rigorous theoretical guarantees, including state stability and bounded approximation error. Empirically, we evaluate NAC across diverse domains, including irregular time-series classification, lane-keeping for autonomous vehicles, and industrial prognostics. NAC matches or outperforms competing baselines in accuracy and sits in an intermediate position in runtime and memory consumption relative to several state-of-the-art CT baselines, while remaining interpretable at the level of individual neuron cells.
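The core idea can be sketched in a few lines. The sketch below is a hypothetical NumPy illustration, not the authors' implementation: each query-key pair is concatenated and fed through two small heads standing in for the paper's content-target and time-constant gates (`w_g`, `w_tau` here are random stand-ins for the sparse NCP-derived wiring), and the logit is the closed-form solution of the linear first-order ODE dz/dt = (g − z)/τ evaluated at time t. Only the Top-K keys per query (by a cheap similarity score) are scored, mimicking the sparse pairwise-concatenation mechanism.

```python
import numpy as np

def nac_logits(q, k, z0=0.0, t=1.0, top_k=4, rng=None):
    """Hypothetical sketch of NAC-style continuous-time attention logits.

    For each selected query-key pair, a content target g and a time
    constant tau are produced by small linear heads (random stand-ins
    for the paper's sparse gating network). The logit is the solution
    of dz/dt = (g - z) / tau at time t:
        z(t) = g + (z0 - g) * exp(-t / tau).
    Keys outside the Top-K per query are masked to -inf.
    """
    rng = np.random.default_rng(0) if rng is None else rng
    n_q, d = q.shape
    n_k = k.shape[0]
    # Stand-in gating parameters (the paper derives these from NCP wiring).
    w_g = rng.standard_normal(2 * d) / np.sqrt(2 * d)    # content-target head
    w_tau = rng.standard_normal(2 * d) / np.sqrt(2 * d)  # time-constant head

    # Cheap similarity to select Top-K candidate keys per query.
    sim = q @ k.T
    idx = np.argsort(-sim, axis=1)[:, :top_k]

    logits = np.full((n_q, n_k), -np.inf)
    for i in range(n_q):
        for j in idx[i]:
            pair = np.concatenate([q[i], k[j]])           # pairwise concatenation
            g = np.tanh(pair @ w_g)                       # content target in (-1, 1)
            tau = np.logaddexp(0.0, pair @ w_tau) + 1e-3  # softplus, keeps tau > 0
            logits[i, j] = g + (z0 - g) * np.exp(-t / tau)
    return logits

q = np.random.default_rng(1).standard_normal((3, 8))
k = np.random.default_rng(2).standard_normal((6, 8))
L = nac_logits(q, k, top_k=4)  # shape (3, 6); 4 finite logits per row
```

Because |g| < 1 and the decay factor lies in (0, 1), every finite logit is bounded in (−1, 1), which is consistent with the bounded-state guarantees the abstract mentions; a softmax over the finite entries of each row then yields the attention weights.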

Waleed Razzaq, Izis Kanjaraway, Yun-Bo Zhao • 2025

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Degradation Estimation | PRONOSTIA | MSE | 37.5 | 33 |
| Degradation Estimation | XJTU-SY | MSE | 22.87 | 33 |
| Degradation Estimation | HUST | MSE | 27.82 | 33 |
| Irregular Time Series Classification | E-MNIST | Accuracy | 96.64 | 33 |
| Irregular Time Series Classification | PAR | Accuracy | 90.18 | 33 |
| Lane-Keeping Trajectory Prediction | Udacity Simulator | MSE | 0.017 | 33 |
| Lane-Keeping Action Classification | OpenAI CarRacing | Accuracy | 80.72 | 33 |
