
Efficient Event-based Delay Learning in Spiking Neural Networks

About

Spiking Neural Networks (SNNs) compute using sparse communication and are attracting increased attention as a more energy-efficient alternative to traditional Artificial Neural Networks (ANNs). While standard ANNs are stateless, spiking neurons are stateful and hence intrinsically recurrent, making them well-suited for spatio-temporal tasks. However, the duration of this intrinsic memory is limited by synaptic and membrane time constants. Delays are a powerful additional mechanism and, in this paper, we propose a novel event-based training method for SNNs with delays, grounded in the EventProp formalism which enables the calculation of exact gradients with respect to weights and delays. Our method supports multiple spikes per neuron and, to the best of our knowledge, is the first delay learning algorithm to be applied to recurrent SNNs. We evaluate our method on a simple sequence detection task, as well as the Yin-Yang, Spiking Heidelberg Digits, Spiking Speech Commands and Braille letter reading datasets, demonstrating that our algorithm can optimise delays from suboptimal initial conditions and enhance classification accuracy compared to architectures without delays. We also find that recurrent delays are particularly beneficial in small networks. Finally, we show that our approach uses less than half the memory of the current state-of-the-art delay-learning method and is up to 26x faster.
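To make the role of synaptic delays concrete, here is a minimal NumPy sketch (not the paper's implementation, which is event-based and computes exact EventProp gradients) of a current-based LIF layer where each synapse carries its own integer delay, realised with a circular buffer of future input currents. All names, sizes, and constants are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

N_IN, N_OUT, T = 4, 3, 20      # input size, output size, simulation timesteps
MAX_DELAY = 5                  # buffer length; delays range over 0..MAX_DELAY-1
TAU_MEM = 5.0                  # membrane time constant (in timesteps)
V_TH = 1.0                     # spike threshold

W = rng.normal(0.5, 0.2, (N_IN, N_OUT))        # synaptic weights
D = rng.integers(0, MAX_DELAY, (N_IN, N_OUT))  # per-synapse integer delays

def run(in_spikes):
    """in_spikes: (T, N_IN) binary array -> (T, N_OUT) binary output spikes."""
    buf = np.zeros((MAX_DELAY, N_OUT))  # circular buffer of delayed input currents
    v = np.zeros(N_OUT)                 # membrane potentials
    out = np.zeros((T, N_OUT))
    alpha = np.exp(-1.0 / TAU_MEM)      # membrane decay factor per timestep
    for t in range(T):
        # route each input spike to the buffer slot matching its synaptic delay
        for i in np.flatnonzero(in_spikes[t]):
            for j in range(N_OUT):
                buf[(t + D[i, j]) % MAX_DELAY, j] += W[i, j]
        v = alpha * v + buf[t % MAX_DELAY]  # leak + currents arriving now
        buf[t % MAX_DELAY] = 0.0            # free the slot for reuse
        out[t] = (v >= V_TH).astype(float)
        v = np.where(out[t] > 0, 0.0, v)    # reset membrane on spike
    return out

spikes_in = (rng.random((T, N_IN)) < 0.2).astype(float)
spikes_out = run(spikes_in)
```

Because each (input, output) pair has an independent delay, a single input spike can influence the postsynaptic neuron at several different future times, which is what extends the network's effective memory beyond the membrane and synaptic time constants.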

Balázs Mészáros, James C. Knight, Thomas Nowotny • 2025

Related benchmarks

Task            Dataset                       Result                           Rank
Classification  SHD                           Accuracy: 93.1                   36
Classification  SSC                           Top-1 Accuracy: 76.1             22
SNN Training    SHD 700 channels (full)       Epoch Training Time (s): 38.93   6
Classification  SHD (val 20% of train set)    Accuracy: 93.2                   6
