
Sparse Axonal and Dendritic Delays Enable Competitive SNNs for Keyword Classification

About

Training transmission delays in spiking neural networks (SNNs) has been shown to substantially improve their performance on complex temporal tasks. In this work, we show that learning either axonal or dendritic delays enables deep feedforward SNNs composed of leaky integrate-and-fire (LIF) neurons to reach accuracy comparable to existing synaptic delay learning approaches, while significantly reducing memory and computational overhead. SNN models with either axonal or dendritic delays achieve up to $95.58\%$ on the Google Speech Commands (GSC) and $80.97\%$ on the Spiking Speech Commands (SSC) datasets, matching or exceeding prior methods based on synaptic delays or more complex neuron models. By adjusting the delay parameters, we obtain improved performance for synaptic delay learning baselines, strengthening the comparison. We find that axonal delays offer the most favorable trade-off, combining lower buffering requirements with slightly higher accuracy than dendritic delays. We further show that the performance of axonal and dendritic delay models is largely preserved under strong delay sparsity, with as few as $20\%$ of delays remaining active, further reducing buffering requirements. Overall, our results indicate that learnable axonal and dendritic delays provide a resource-efficient and effective mechanism for temporal representation in SNNs. Code is available at https://github.com/YounesBouhadjar/AxDenSynDelaySNN
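To make the idea concrete, here is a minimal NumPy sketch of the axonal-delay mechanism described above: each presynaptic neuron carries one delay shared by all of its outgoing synapses (versus one delay per synapse), so a layer needs only `n_in` delay parameters and one buffer per presynaptic neuron. All values below (delay range, weights, firing statistics) are illustrative assumptions, not the paper's configuration, and the delays are fixed here rather than learned.

```python
import numpy as np

rng = np.random.default_rng(0)

T, n_in, n_out = 100, 8, 4        # time steps, input/output neurons
tau, v_th = 10.0, 1.0             # LIF membrane time constant, threshold
alpha = np.exp(-1.0 / tau)        # per-step membrane leak factor

# Axonal delays: ONE integer delay per presynaptic neuron (learned in the
# paper; fixed random values here for illustration).
d = rng.integers(0, 10, size=n_in)

W = rng.normal(0.0, 0.5, size=(n_in, n_out))
spikes_in = (rng.random((T, n_in)) < 0.1).astype(float)

# Apply axonal delays: shift each input channel by its own delay,
# so every postsynaptic target sees the same shifted spike train.
delayed = np.zeros_like(spikes_in)
for n in range(n_in):
    delayed[d[n]:, n] = spikes_in[: T - d[n], n]

# Leaky integrate-and-fire dynamics with reset-to-zero on spike.
v = np.zeros(n_out)
spikes_out = np.zeros((T, n_out))
for t in range(T):
    v = alpha * v + delayed[t] @ W
    fired = v >= v_th
    spikes_out[t] = fired
    v[fired] = 0.0
```

A dendritic delay would instead be one delay per *postsynaptic* neuron applied to its aggregate input, and the sparsity result in the abstract corresponds to zeroing out most entries of `d` while keeping accuracy largely intact.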

Younes Bouhadjar, Emre Neftci • 2026

Related benchmarks

Task | Dataset | Result | Rank
Keyword Classification | SSC v1.0 (test) | Number of Spikes: 1.74e+4 | 12
Keyword Classification | GSC v1.0 (test) | Number of Spikes: 1.27e+4 | 12
