
Delays in Spiking Neural Networks: A State Space Model Approach

About

Spiking neural networks (SNNs) are biologically inspired, event-driven models suited for temporal data processing and energy-efficient neuromorphic computing. In SNNs, richer neuronal dynamics allow capturing more complex temporal dependencies, with delays playing a crucial role by allowing past inputs to directly influence present spiking behavior. We propose a general framework for incorporating delays into SNNs through additional state variables. The proposed mechanism enables each neuron to access a finite temporal input history. The framework is agnostic to the neuron model and hence can be seamlessly integrated into standard spiking neuron models such as Leaky Integrate-and-Fire (LIF) and Adaptive LIF (adLIF). We analyze how the duration of the delays and the learnable parameters associated with them affect performance. We investigate the trade-offs in the network architecture due to the additional state variables introduced by the delay mechanism. Experiments on the Spiking Heidelberg Digits (SHD) dataset show that the proposed mechanism matches existing delay-based SNNs in performance while remaining computationally efficient, with particular gains in smaller networks.
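To make the idea concrete, the following is a minimal sketch of a LIF neuron augmented with a finite input-history buffer, where each delay tap carries its own learnable weight. This is an illustrative toy in NumPy, not the paper's implementation: the names (`w_d`, `alpha`, `v_th`, `n_delay`) and the hard-reset rule are assumptions for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

class DelayedLIF:
    """Toy LIF neuron with delayed inputs via extra state variables.

    Hypothetical sketch: the buffer `buf` holds the last `n_delay`
    input vectors, and `w_d` assigns a learnable weight to each
    (delay tap, input) pair, so past inputs directly drive the
    present membrane potential.
    """
    def __init__(self, n_in, n_delay=5, alpha=0.9, v_th=1.0):
        self.alpha = alpha                      # membrane leak factor (assumed)
        self.v_th = v_th                        # firing threshold (assumed)
        self.v = 0.0                            # membrane potential
        self.w_d = rng.normal(0.0, 0.5, size=(n_delay, n_in))
        self.buf = np.zeros((n_delay, n_in))    # finite input history (delay states)

    def step(self, x):
        # shift the history: every stored input moves one tap deeper
        self.buf = np.roll(self.buf, 1, axis=0)
        self.buf[0] = x
        # input current is drawn from the whole history, not just x[t]
        i_t = float(np.sum(self.w_d * self.buf))
        self.v = self.alpha * self.v + i_t
        spike = self.v >= self.v_th
        if spike:
            self.v = 0.0                        # hard reset after a spike (assumed)
        return bool(spike)

neuron = DelayedLIF(n_in=3)
spikes = [neuron.step(rng.integers(0, 2, size=3)) for _ in range(20)]
```

The delay buffer and its tap weights are exactly the "additional state variables" of the abstract: they cost memory and compute per neuron, which is the architectural trade-off the paper analyzes.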

Sanja Karilanova, Subhrakanti Dey, Ayça Özçelikkale • 2025

Related benchmarks

Task                    | Dataset                                  | Result | Rank
Sequence Classification | Spiking Heidelberg Dataset (SHD) (test)  | -      | 4
