
Advancing Training Efficiency of Deep Spiking Neural Networks through Rate-based Backpropagation

About

Recent insights have revealed that rate-coding is a primary form of information representation captured by surrogate-gradient-based Backpropagation Through Time (BPTT) in training deep Spiking Neural Networks (SNNs). Motivated by these findings, we propose rate-based backpropagation, a training strategy specifically designed to exploit rate-based representations to reduce the complexity of BPTT. Our method minimizes reliance on detailed temporal derivatives by focusing on averaged dynamics, streamlining the computational graph to reduce the memory and computational demands of SNN training. We substantiate the soundness of the gradient approximation between BPTT and the proposed method through both theoretical analysis and empirical observations. Comprehensive experiments on CIFAR-10, CIFAR-100, ImageNet, and CIFAR10-DVS validate that our method achieves performance comparable to BPTT counterparts and surpasses state-of-the-art efficient training techniques. By leveraging the inherent benefits of rate-coding, this work sets the stage for more scalable and efficient SNN training within resource-constrained environments. Our code is available at https://github.com/Tab-ct/rate-based-backpropagation.
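The core idea described above is to run the spiking forward pass as usual but to route gradients through time-averaged (rate) quantities instead of through the full temporal unrolling that BPTT requires. The sketch below is not the authors' implementation (see the linked repository for that); it is a minimal PyTorch illustration, assuming a leaky integrate-and-fire neuron and a sigmoid surrogate, of how a rate-based backward pass can be attached to a detached spiking forward loop via a straight-through estimator. All names and choices here (`RateLIF`, the `slope` parameter, the `[T, batch, features]` layout) are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch (not the authors' code): rate-based backward pass for an LIF
# layer. The spiking forward loop runs without autograd; gradients flow only
# through the time-averaged input via a sigmoid surrogate (straight-through).
import torch
import torch.nn as nn


class RateLIF(nn.Module):
    """Leaky integrate-and-fire layer; backward uses averaged (rate) dynamics."""

    def __init__(self, tau: float = 2.0, v_th: float = 1.0, slope: float = 4.0):
        super().__init__()
        self.tau, self.v_th, self.slope = tau, v_th, slope

    def forward(self, x_seq: torch.Tensor) -> torch.Tensor:
        # x_seq: [T, batch, features] input currents over T timesteps.
        v = torch.zeros_like(x_seq[0])
        spikes = []
        with torch.no_grad():                       # no temporal graph is stored
            for x_t in x_seq:
                v = v + (x_t - v) / self.tau        # leaky integration
                s = (v >= self.v_th).float()        # hard threshold
                v = v * (1.0 - s)                   # reset fired neurons
                spikes.append(s)
        rate_out = torch.stack(spikes).mean(dim=0)  # observed firing rate
        rate_in = x_seq.mean(dim=0)                 # averaged input current

        # Straight-through estimator: the forward value is the true rate, the
        # gradient is taken through a smooth surrogate of the averaged dynamics.
        surrogate = torch.sigmoid(self.slope * (rate_in - self.v_th))
        return rate_out.detach() + surrogate - surrogate.detach()


if __name__ == "__main__":
    T, batch, dim, classes = 4, 8, 32, 10
    enc = nn.Linear(dim, 64)
    lif = RateLIF()
    head = nn.Linear(64, classes)

    x = torch.rand(T, batch, dim)                   # e.g. rate-coded input
    currents = torch.stack([enc(x[t]) for t in range(T)])
    logits = head(lif(currents))                    # classify from firing rates
    loss = nn.functional.cross_entropy(logits, torch.randint(0, classes, (batch,)))
    loss.backward()                                 # single backward, no BPTT unroll
```

Because the per-timestep loop runs under `torch.no_grad()`, autograd stores no intermediate state per step; activation memory for the neuron layer is constant in T rather than linear in T, which is the efficiency benefit the abstract attributes to replacing BPTT's temporal graph with averaged dynamics.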

Chengting Yu, Lei Liu, Gaoang Wang, Erping Li, Aili Wang • 2024

Related benchmarks

Task                  Dataset                      Metric           Result   Rank
Image Classification  CIFAR-100 (test)             --               --       3518
Image Classification  CIFAR-10 (test)              --               --       3381
Image Classification  CIFAR100                     Accuracy         80.71    347
Image Classification  ImageNet (val)               --               --       300
Classification        CIFAR10-DVS                  --               --       145
Image Classification  CIFAR10                      Top-1 Accuracy   96.26    112
Image Classification  CIFAR10-DVS (test)           Accuracy         76.96    90
Image Classification  ImageNet single-crop (val)   Top-1 Accuracy   70.01    13

Other info

Code: https://github.com/Tab-ct/rate-based-backpropagation
