
NEXUS: Bit-Exact ANN-to-SNN Equivalence via Neuromorphic Gate Circuits with Surrogate-Free Training

About

Spiking Neural Networks (SNNs) promise energy-efficient computing through event-driven sparsity, yet all existing approaches sacrifice accuracy by approximating continuous values with discrete spikes. We propose NEXUS, a framework that achieves bit-exact ANN-to-SNN equivalence -- not approximate, but mathematically identical outputs. Our key insight is constructing all arithmetic operations, both linear and nonlinear, from pure IF neuron logic gates that implement IEEE-754 compliant floating-point arithmetic. Through spatial bit encoding (zero encoding error by construction), hierarchical neuromorphic gate circuits (from basic logic gates to complete transformer layers), and surrogate-free STE training (exact identity mapping rather than heuristic approximation), NEXUS produces outputs identical to standard ANNs up to machine precision. Experiments on models up to LLaMA-2 70B demonstrate identical task accuracy (0.00% degradation) with mean ULP error of only 6.19, while achieving 27-168,000$\times$ energy reduction on neuromorphic hardware. Crucially, spatial bit encoding's single-timestep design renders the framework inherently immune to membrane potential leakage (100% accuracy across all decay factors $\beta\in[0.1,1.0]$), while tolerating synaptic noise up to $\sigma=0.2$ with >98% gate-level accuracy.
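The abstract's core idea is building all arithmetic from pure IF-neuron logic gates. As an illustrative sketch only (not the paper's actual implementation), the snippet below shows how single-timestep integrate-and-fire neurons with hand-chosen weights and thresholds realize exact Boolean gates, and how those gates compose into a full adder, the first rung of a hierarchy that could scale up to IEEE-754 floating-point circuits. All function names and parameter values here are hypothetical.

```python
# Sketch: single-timestep IF neurons acting as exact Boolean logic gates.
# Weights and thresholds are illustrative choices, not taken from the paper.

def if_neuron(inputs, weights, threshold):
    """One-timestep integrate-and-fire neuron: the membrane potential is the
    weighted input sum; the neuron spikes (outputs 1) iff it reaches threshold."""
    v = sum(w * x for w, x in zip(weights, inputs))
    return 1 if v >= threshold else 0

# Basic gates from single IF neurons (inputs and outputs are spikes: 0 or 1).
def AND(a, b): return if_neuron([a, b], [1, 1], 2)   # spikes only if both fire
def OR(a, b):  return if_neuron([a, b], [1, 1], 1)   # spikes if either fires
def NOT(a):    return if_neuron([a, 1], [-1, 1], 1)  # constant bias input of 1

# Two-layer composition: XOR = (a OR b) AND NOT(a AND b).
def XOR(a, b): return AND(OR(a, b), NOT(AND(a, b)))

def full_adder(a, b, cin):
    """One-bit full adder built purely from IF-neuron gates.
    Returns (sum_bit, carry_out); chaining these gives a ripple-carry adder."""
    s1 = XOR(a, b)
    return XOR(s1, cin), OR(AND(a, b), AND(s1, cin))
```

Because every gate fires deterministically in a single timestep, the circuit's output is bit-exact by construction, which is consistent with the claimed immunity to membrane-potential leakage: with one timestep, there is no interval over which the potential can decay.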

Zhengzheng Tang • 2026

Related benchmarks

Task | Dataset | Result | Rank
Commonsense Reasoning | HellaSwag | Accuracy: 86.9 | 1460
Multitask Language Understanding | MMLU | Accuracy: 65.4 | 206
Reasoning | ARC | Accuracy: 67.2 | 83
Question Answering | TruthfulQA | Accuracy: 68.26 | 73
Energy Efficiency Analysis | Neural Network Operations and Components Suite | Energy (nJ), Loihi: 4.00e-4 | 33
Language Modeling | WikiText-2 (LLaMA-2 7B) | -- | 3
