Write a PREreview

A Practical Tutorial on Spiking Neural Networks: Comprehensive Review, Models, Experiments, Software Tools, and Implementation Guidelines

Posted
Server: Preprints.org
DOI: 10.20944/preprints202509.2072.v1

Spiking Neural Networks (SNNs) provide a biologically inspired, event-driven alternative to Artificial Neural Networks (ANNs), with the potential to deliver competitive accuracy at substantially lower energy. This tutorial study offers a unified, practice-oriented assessment that combines a critical review with standardized experiments. We benchmark a shallow Fully Connected Network (FCN) on MNIST and a deeper VGG7 architecture on CIFAR-10 across multiple neuron models (Leaky Integrate-and-Fire (LIF), Sigma-Delta, etc.) and input encodings (direct, rate, temporal, etc.) using supervised surrogate-gradient training, implemented with Intel Lava/SLAYER, SpikingJelly, Norse, and PyTorch. Empirically, we observe a consistent but tunable trade-off between accuracy and energy. On MNIST, Sigma-Delta neurons with rate or Sigma-Delta encodings reach 98.1% (ANN: 98.23%). On CIFAR-10, Sigma-Delta neurons with direct input achieve 83.0% at just 2 time steps (ANN: 83.6%). A GPU-based operation-count energy proxy indicates that many SNN configurations operate below the ANN energy baseline; some frugal codes minimize energy at the cost of accuracy, whereas accuracy-leaning settings (e.g., Sigma-Delta with direct or rate coding) narrow the performance gap while remaining energy-conscious, yielding up to a 3-fold energy-efficiency gain over matched ANNs in our setup. Thresholds and the number of time steps are decisive: intermediate thresholds, combined with the shortest time window that still meets accuracy targets, typically maximize energy efficiency. We distill actionable design rules: choose the neuron/encoding pair according to the application goal (accuracy-critical vs. energy-constrained) and co-tune thresholds and time steps. Finally, we outline how event-driven neuromorphic hardware can amplify these savings through sparse, local, asynchronous computation, providing a practical playbook for embedded, real-time, and sustainable AI deployments.
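The LIF neuron mentioned in the abstract can be summarized by a discrete-time leak–integrate–fire–reset loop. The sketch below is a minimal illustration, not the paper's implementation; the leak factor `beta`, threshold `v_th`, and the soft-reset-by-subtraction choice are illustrative assumptions.

```python
import numpy as np

def lif_step(v, x, beta=0.9, v_th=1.0):
    """One discrete-time LIF update: leak, integrate, fire, reset.

    v    : membrane potential(s) from the previous step
    x    : input current at this step
    beta : leak factor (illustrative; the paper tunes thresholds and steps)
    v_th : firing threshold
    """
    v = beta * v + x                      # leaky integration
    spikes = (v >= v_th).astype(float)    # emit a spike where threshold is crossed
    v = v - spikes * v_th                 # soft reset by subtraction
    return v, spikes

# Drive a single neuron with a constant input for 10 steps
v = np.zeros(1)
spike_train = []
for t in range(10):
    v, s = lif_step(v, np.array([0.4]))
    spike_train.append(int(s[0]))
```

Because the spike nonlinearity is non-differentiable, supervised training replaces its gradient with a smooth surrogate (e.g., a sigmoid derivative) during backpropagation, which is the surrogate-gradient approach the paper uses.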
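Of the input encodings benchmarked, rate coding is the simplest to state: each input intensity becomes a firing probability per time step. A minimal Bernoulli-sampling sketch (an assumption about the encoding's form, not the paper's exact code):

```python
import numpy as np

rng = np.random.default_rng(0)

def rate_encode(image, num_steps=20):
    """Bernoulli rate coding: each intensity in [0, 1] is the firing
    probability of its input neuron at every time step.

    Returns a 0/1 spike array of shape (num_steps, *image.shape).
    """
    probs = np.broadcast_to(image, (num_steps,) + image.shape)
    return (rng.random(probs.shape) < probs).astype(np.uint8)

# A bright pixel (0.9) spikes far more often than a dim one (0.1)
img = np.array([0.9, 0.1])
spikes = rate_encode(img, num_steps=100)
rates = spikes.mean(axis=0)   # empirical firing rates approximate intensities
```

Direct coding skips this sampling and feeds the analog values straight into the first spiking layer, which is why it can reach good accuracy at very few time steps, as in the paper's 2-step CIFAR-10 result.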
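The operation-count energy proxy compares ANN multiply-accumulates (MACs) against spike-gated SNN accumulates (ACs) over the time window. The sketch below is a crude illustration under stated assumptions: the per-operation energies `e_mac` and `e_ac` are placeholder values, not the paper's figures, and real costs depend on hardware and technology node.

```python
def snn_ann_energy_ratio(macs_ann, synops_per_step, avg_spike_rate, num_steps,
                         e_mac=4.6e-12, e_ac=0.9e-12):
    """Operation-count energy proxy (illustrative sketch).

    ANN cost: one MAC per connection.
    SNN cost: one AC per synaptic operation, gated by the average spike
    rate and repeated over the time window.
    """
    e_ann = macs_ann * e_mac
    e_snn = num_steps * avg_spike_rate * synops_per_step * e_ac
    return e_snn / e_ann

# Example: identical connectivity, 20% spike sparsity, 4 time steps
ratio = snn_ann_energy_ratio(macs_ann=1e6, synops_per_step=1e6,
                             avg_spike_rate=0.2, num_steps=4)
```

This toy model makes the abstract's two decisive knobs visible: raising the threshold lowers `avg_spike_rate`, and shrinking the time window lowers `num_steps`, and both act multiplicatively on the SNN-side cost.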

You can write a PREreview of A Practical Tutorial on Spiking Neural Networks: Comprehensive Review, Models, Experiments, Software Tools, and Implementation Guidelines. A PREreview is a review of a preprint; it can range from a few sentences to a lengthy report, similar to a journal-organized peer-review report.
