Deep learning with spiking neural networks on IPUs.

Project description

The brain is the perfect place to look for inspiration to develop more efficient neural networks. One of the main differences from modern deep learning is that the brain encodes information in spikes rather than in continuous activations. snnTorch is a Python package for performing gradient-based learning with spiking neural networks. It extends the capabilities of PyTorch, taking advantage of its GPU-accelerated tensor computation and applying it to networks of spiking neurons. Pre-designed spiking neuron models are seamlessly integrated within the PyTorch framework and can be treated as recurrent activation units.
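
For instance, a single leaky integrate-and-fire neuron can be stepped through time much like a recurrent cell. A minimal sketch (the input current is random data, invented for illustration):

import torch
import snntorch as snn

lif = snn.Leaky(beta=0.9)        # leaky integrate-and-fire neuron with decay rate beta
mem = lif.init_leaky()           # initialize the membrane potential

cur_in = torch.rand(100, 1)      # 100 time steps of input current for one neuron
spk_rec = []

for step in range(100):
    spk, mem = lif(cur_in[step], mem)  # one step: returns the output spike and updated membrane
    spk_rec.append(spk)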

If you like this project, please consider starring ⭐ this repo as it is the easiest and best way to support it.

If you have issues, comments, or are looking for advice on training spiking neural networks, you can open an issue, start a discussion, or chat in our Discord channel.

snnTorch Structure

snnTorch contains the following components:

  • snntorch: a spiking neuron library like torch.nn, deeply integrated with autograd
  • snntorch.backprop: variations of backpropagation commonly used with SNNs
  • snntorch.functional: common arithmetic operations on spikes, e.g., loss and regularization
  • snntorch.spikegen: a library for spike generation and data conversion (see the sketch after this list)
  • snntorch.spikeplot: visualization tools for spike-based data using matplotlib and celluloid
  • snntorch.spikevision: popular neuromorphic datasets
  • snntorch.surrogate: optional surrogate gradient functions
  • snntorch.utils: dataset utility functions
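
As a brief illustration of snntorch.spikegen, the sketch below rate-codes a batch of static images into a spike train; the tensor shapes are invented for illustration:

import torch
from snntorch import spikegen

data = torch.rand(128, 1, 28, 28)               # batch of static images, values in [0, 1]
spike_data = spikegen.rate(data, num_steps=25)  # rate-coded spikes: [25, 128, 1, 28, 28]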

snnTorch is designed to be used intuitively with PyTorch, as though each spiking neuron were simply another activation in a sequence of layers. It is therefore agnostic to fully-connected layers, convolutional layers, residual connections, etc.

At present, the neuron models are represented by recursive functions, which removes the need to store membrane potential traces for all neurons in a system in order to calculate the gradient. The lean requirements of snnTorch enable small and large networks to be viably trained on CPU where needed. Provided that the network models and tensors are loaded onto CUDA, snnTorch takes advantage of GPU acceleration in the same way as PyTorch.
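
For example, GPU training follows the usual PyTorch pattern. A minimal sketch, with a tiny fully-connected SNN standing in for any snnTorch model:

import torch
import torch.nn as nn
import snntorch as snn

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# any architecture works; a one-layer SNN as a stand-in
net = nn.Sequential(nn.Linear(784, 10),
                    snn.Leaky(beta=0.9, init_hidden=True)).to(device)

data = torch.rand(128, 784, device=device)  # inputs must live on the same device
spk = net(data)                             # runs on the GPU when one is available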

Citation

If you find snnTorch useful in your work, please cite the following source:

Jason K. Eshraghian, Max Ward, Emre Neftci, Xinxin Wang, Gregor Lenz, Girish Dwivedi, Mohammed Bennamoun, Doo Seok Jeong, and Wei D. Lu “Training Spiking Neural Networks Using Lessons From Deep Learning”. arXiv preprint arXiv:2109.12894, September 2021.

@article{eshraghian2021training,
    title   = {Training spiking neural networks using lessons from deep learning},
    author  = {Eshraghian, Jason K and Ward, Max and Neftci, Emre and Wang, Xinxin
               and Lenz, Gregor and Dwivedi, Girish and Bennamoun, Mohammed and
               Jeong, Doo Seok and Lu, Wei D},
    journal = {arXiv preprint arXiv:2109.12894},
    year    = {2021}
}

Let us know if you are using snnTorch in any interesting work, research or blogs, as we would love to hear more about it! Reach out at snntorch@gmail.com.

Requirements

The following packages need to be installed to use snnTorch:

  • torch >= 1.1.0

  • numpy >= 1.17

  • pandas

  • matplotlib

They are automatically installed if snnTorch is installed using the pip command. Ensure the correct version of torch is installed for your system to enable CUDA compatibility.

Installation

Run the following to install:

$ pip install snntorch

To install snnTorch from source instead:

$ git clone https://github.com/jeshraghian/snntorch
$ cd snntorch
$ python setup.py install

To install snntorch with conda:

$ conda install -c conda-forge snntorch

API & Examples

A complete API is available here. Examples, tutorials and Colab notebooks are provided.

Quickstart

There are a few ways to get started with snnTorch. For a quick example, see the following snippet, or try the quickstart notebook on Colab:

import torch, torch.nn as nn
import snntorch as snn
from snntorch import surrogate

num_steps = 25 # number of time steps
batch_size = 1
beta = 0.5  # neuron decay rate
spike_grad = surrogate.fast_sigmoid()

net = nn.Sequential(
      nn.Conv2d(1, 8, 5),
      nn.MaxPool2d(2),
      snn.Leaky(beta=beta, init_hidden=True, spike_grad=spike_grad),
      nn.Conv2d(8, 16, 5),
      nn.MaxPool2d(2),
      snn.Leaky(beta=beta, init_hidden=True, spike_grad=spike_grad),
      nn.Flatten(),
      nn.Linear(16 * 4 * 4, 10),
      snn.Leaky(beta=beta, init_hidden=True, spike_grad=spike_grad, output=True)
      )

# random input data: [num_steps x batch_size x channels x height x width]
data_in = torch.rand(num_steps, batch_size, 1, 28, 28)

spike_recording = []

for step in range(num_steps):
    spike, state = net(data_in[step])
    spike_recording.append(spike)
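
The recorded spikes can then be stacked into a single tensor. Summing over time gives a spike count per output neuron, which is a common way to read out a prediction. Continuing the snippet above:

spike_recording = torch.stack(spike_recording)  # [num_steps x batch_size x 10]
spike_count = spike_recording.sum(dim=0)        # total spikes per output neuron
prediction = spike_count.argmax(dim=1)          # the most active neuron is the predicted class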

If you’re feeling lazy and want the training process to be taken care of:

import snntorch.functional as SF
from snntorch import backprop

# correct class should fire 80% of the time
loss_fn = SF.mse_count_loss(correct_rate=0.8, incorrect_rate=0.2)
optimizer = torch.optim.Adam(net.parameters(), lr=1e-3, betas=(0.9, 0.999))

# train for one epoch using the backprop through time algorithm
# assume train_loader is a DataLoader with time-varying input
avg_loss = backprop.BPTT(net, train_loader, optimizer=optimizer,
                         num_steps=num_steps, criterion=loss_fn)
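
Under the hood, this style of training amounts to accumulating a loss over the simulated time steps and calling backward() once per batch. A hand-rolled sketch of the same idea, assuming train_loader yields (data, targets) pairs with time on the leading axis of data:

from snntorch import utils

for data, targets in train_loader:
    utils.reset(net)                    # reset hidden neuron states between batches
    spike_recording = []
    for step in range(num_steps):
        spike, state = net(data[step])  # simulate one time step
        spike_recording.append(spike)
    loss = loss_fn(torch.stack(spike_recording), targets)
    optimizer.zero_grad()
    loss.backward()                     # backpropagation through time
    optimizer.step()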

A Deep Dive into SNNs

If you wish to learn all the fundamentals of training spiking neural networks, from neuron models and the neural code through to backpropagation, the snnTorch tutorial series is a great place to begin. It consists of interactive notebooks with complete explanations that can get you up to speed.

Each tutorial is available as an interactive Colab notebook:

  • Tutorial 1: Spike Encoding with snnTorch
  • Tutorial 2: The Leaky Integrate-and-Fire Neuron
  • Tutorial 3: A Feedforward Spiking Neural Network
  • Tutorial 4: 2nd Order Spiking Neuron Models (Optional)
  • Tutorial 5: Training Spiking Neural Networks with snnTorch
  • Tutorial 6: Surrogate Gradient Descent in a Convolutional SNN
  • Tutorial 7: Neuromorphic Datasets with Tonic + snnTorch

Advanced tutorials, also available on Colab:

  • Population Coding

Contributing

If you’re ready to contribute to snnTorch, instructions to do so can be found here.

Acknowledgments

snnTorch was initially developed by Jason K. Eshraghian in the Lu Group (University of Michigan).

Additional contributions were made by Xinxin Wang, Vincent Sun, and Emre Neftci.

Several features in snnTorch were inspired by the work of Friedemann Zenke, Emre Neftci, Doo Seok Jeong, Sumit Bam Shrestha and Garrick Orchard.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

snntorch-ipu-0.5.18.tar.gz (23.2 MB, Source)

Built Distribution

snntorch_ipu-0.5.18-py2.py3-none-any.whl (93.6 kB, Python 2 and Python 3)

File details

Details for the file snntorch-ipu-0.5.18.tar.gz.

File metadata

  • Download URL: snntorch-ipu-0.5.18.tar.gz
  • Upload date:
  • Size: 23.2 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.1 CPython/3.10.4

File hashes

Hashes for snntorch-ipu-0.5.18.tar.gz:

  • SHA256: 73fc881971716264cb191d71525fa366b7a2280e46c279cc56d101587bdecfd7
  • MD5: 98a6bbc8c906975dc4a63f7733082096
  • BLAKE2b-256: 60d2fe28a5cc894fbb6d25cdcea2bb5ae786a26cb0eae7516ccadfb107cf898c

See more details on using hashes here.

File details

Details for the file snntorch_ipu-0.5.18-py2.py3-none-any.whl.

File metadata

File hashes

Hashes for snntorch_ipu-0.5.18-py2.py3-none-any.whl:

  • SHA256: 4a2d496d2ed10b0fff0aac786e26c5e5dbc8e152b73ed287e70f3852ddf5d641
  • MD5: feb6c9e2d24898e4d704a01831a3ca2b
  • BLAKE2b-256: b84bf1ae5575719adf3ac6393ae2ec1adde4c6a43a57c10eba012def97b2c972

See more details on using hashes here.
