NEXUS: Bit-Exact ANN-to-SNN Equivalence via Neuromorphic Gate Circuits with Surrogate-Free Training
About
Spiking Neural Networks (SNNs) promise energy-efficient computing through event-driven sparsity, yet all existing approaches sacrifice accuracy by approximating continuous values with discrete spikes. We propose NEXUS, a framework that achieves bit-exact ANN-to-SNN equivalence -- not approximate, but mathematically identical outputs. Our key insight is constructing all arithmetic operations, both linear and nonlinear, from pure IF neuron logic gates that implement IEEE-754 compliant floating-point arithmetic. Through spatial bit encoding (zero encoding error by construction), hierarchical neuromorphic gate circuits (from basic logic gates to complete transformer layers), and surrogate-free STE training (exact identity mapping rather than heuristic approximation), NEXUS produces outputs identical to standard ANNs up to machine precision. Experiments on models up to LLaMA-2 70B demonstrate identical task accuracy (0.00% degradation) with mean ULP error of only 6.19, while achieving 27-168,000$\times$ energy reduction on neuromorphic hardware. Crucially, spatial bit encoding's single-timestep design renders the framework inherently immune to membrane potential leakage (100% accuracy across all decay factors $\beta\in[0.1,1.0]$), while tolerating synaptic noise up to $\sigma=0.2$ with >98% gate-level accuracy.
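The hierarchical gate construction can be illustrated with a minimal sketch: an integrate-and-fire (IF) neuron fires iff its weighted input drive reaches threshold, which directly realizes Boolean gates; composing those gates yields adders and, ultimately, full floating-point units. This is an illustrative reconstruction, not the authors' code; the function names (`if_neuron`, `full_adder`, `add4`) are hypothetical, and the sketch stops at integer ripple-carry addition rather than full IEEE-754 arithmetic.

```python
# Illustrative sketch (not the NEXUS implementation): logic gates from
# single-timestep IF neurons under spatial bit encoding, where each bit
# of a value travels on its own spike line. No leak, no surrogate needed.

def if_neuron(inputs, weights, threshold):
    """IF neuron: emit a spike (1) iff the weighted drive reaches threshold."""
    return 1 if sum(w * x for w, x in zip(weights, inputs)) >= threshold else 0

def AND(a, b): return if_neuron([a, b], [1, 1], 2)
def OR(a, b):  return if_neuron([a, b], [1, 1], 1)
def NOT(a):    return if_neuron([a, 1], [-1, 1], 1)   # constant-1 bias input
def XOR(a, b): return AND(OR(a, b), NOT(AND(a, b)))

def full_adder(a, b, cin):
    """One bit of a ripple-carry adder built purely from IF-neuron gates."""
    s = XOR(XOR(a, b), cin)
    cout = OR(AND(a, b), AND(cin, XOR(a, b)))
    return s, cout

def add4(x_bits, y_bits):
    """4-bit ripple-carry add; bit lists are LSB-first spike lines."""
    carry, out = 0, []
    for a, b in zip(x_bits, y_bits):
        s, carry = full_adder(a, b, carry)
        out.append(s)
    return out + [carry]

# 6 + 3 = 9: LSB-first 0110 + 1100 -> 10010
print(add4([0, 1, 1, 0], [1, 1, 0, 0]))  # -> [1, 0, 0, 1, 0]
```

Because every gate output is an exact 0/1 spike decided in a single timestep, there is no temporal accumulation for membrane leak to corrupt, which is the intuition behind the leak-immunity result quoted above.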
Related benchmarks
| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Commonsense Reasoning | HellaSwag | Accuracy | 86.9 | 1460 |
| Multitask Language Understanding | MMLU | Accuracy | 65.4 | 206 |
| Reasoning | ARC | Accuracy | 67.2 | 83 |
| Question Answering | TruthfulQA | Accuracy | 68.26 | 73 |
| Energy Efficiency Analysis | Neural Network Operations and Components Suite | Energy (nJ), Loihi | 4.00e-4 | 33 |
| Language Modeling | WikiText-2 (LLaMA-2 7B) | -- | -- | 3 |