
Revisiting Batch Normalization for Training Low-latency Deep Spiking Neural Networks from Scratch

About

Spiking Neural Networks (SNNs) have recently emerged as an alternative to deep learning owing to their sparse, asynchronous, binary event- (or spike-) driven processing, which can yield huge energy-efficiency benefits on neuromorphic hardware. However, training high-accuracy and low-latency SNNs from scratch suffers from the non-differentiable nature of a spiking neuron. To address this training issue in SNNs, we revisit batch normalization and propose a temporal Batch Normalization Through Time (BNTT) technique. Most prior SNN works have disregarded batch normalization, deeming it ineffective for training temporal SNNs. Different from previous works, our proposed BNTT decouples the parameters in a BNTT layer along the time axis to capture the temporal dynamics of spikes. The temporally evolving learnable parameters in BNTT allow a neuron to control its spike rate through different time-steps, enabling low-latency and low-energy training from scratch. We conduct experiments on CIFAR-10, CIFAR-100, Tiny-ImageNet and the event-driven DVS-CIFAR10 dataset. BNTT allows us to train deep SNN architectures from scratch, for the first time, on complex datasets with just 25-30 time-steps. We also propose an early-exit algorithm that uses the distribution of parameters in BNTT to reduce latency at inference, further improving energy efficiency.
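The core idea of BNTT — batch normalization whose learnable scale parameters are decoupled along the time axis — can be illustrated with a minimal numpy sketch. This is an assumption-laden illustration of the mechanism described in the abstract, not the authors' implementation: the function name, the (T, N, C) tensor layout, and the use of a scale-only (gamma) parameterization are all choices made here for clarity.

```python
import numpy as np

def bntt_forward(x_seq, gammas, eps=1e-5):
    """Illustrative sketch of Batch Normalization Through Time (BNTT).

    x_seq:  (T, N, C) pre-activations over T time-steps, batch size N,
            C channels.
    gammas: (T, C) learnable scale parameters -- one gamma vector PER
            time-step, which is BNTT's key departure from standard BN
            (standard BN would share a single (C,) gamma across all t).
    """
    out = np.empty_like(x_seq)
    for t in range(x_seq.shape[0]):
        x = x_seq[t]
        mu = x.mean(axis=0)    # per-channel batch mean at time-step t
        var = x.var(axis=0)    # per-channel batch variance at time-step t
        # Normalize with this step's statistics, scale with this step's gamma.
        out[t] = gammas[t] * (x - mu) / np.sqrt(var + eps)
    return out
```

Because each time-step gets its own gamma, the network can learn to suppress or amplify spiking activity at different steps, which is what enables the low-latency (25-30 time-step) training and the gamma-distribution-based early exit described above.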

Youngeun Kim, Priyadarshini Panda • 2020

Related benchmarks

Task            Dataset        Result            Rank
Classification  CIFAR10-DVS    Accuracy: 63.2%   145
