A library for deep learning with spiking neural networks

Norse aims to exploit the advantages of bio-inspired neural components, which are sparse and event-driven, a fundamental difference from artificial neural networks. Norse expands PyTorch with primitives for bio-inspired neural components, bringing you two advantages: a modern, proven infrastructure based on PyTorch, and deep learning-compatible spiking neural network components.

Documentation: norse.github.io/norse/

1. Getting started

The fastest way to try Norse is via the Jupyter notebooks on Google Colab.

Alternatively, you can install Norse locally and run one of the included tasks such as MNIST:

python -m norse.task.mnist

2. Using Norse

Norse presents plug-and-play components for deep learning with spiking neural networks. Here, we describe how to install Norse and start to apply it in your own work. Read more in our documentation.

2.1. Installation

We assume you are using Python version 3.7+ and have installed PyTorch version 1.9 or higher. Read more about the prerequisites in our documentation.

Method        Instructions                                         Prerequisites
From PyPI     pip install norse                                    Pip
From source   pip install -qU git+https://github.com/norse/norse   Pip, PyTorch
With Docker   docker pull quay.io/norse/norse                      Docker
From Conda    conda install -c norse norse                         Anaconda or Miniconda

For troubleshooting, please refer to our installation guide, create an issue on GitHub or write us on Discord.

2.2. Running examples

Norse is bundled with a number of example tasks, serving as short, self-contained, correct examples (SSCCE). They can be run by invoking the norse module from the base directory. More information and tasks are available in our documentation and in your console by typing: python -m norse.task.<task> --help, where <task> is one of the task names.

  • To train an MNIST classification network, invoke
    python -m norse.task.mnist
    
  • To train a CIFAR classification network, invoke
    python -m norse.task.cifar10
    
  • To train the cartpole balancing task with policy gradients, invoke
    python -m norse.task.cartpole
    

Norse is compatible with PyTorch Lightning, as demonstrated in the PyTorch Lightning MNIST task variant (requires PyTorch Lightning):

python -m norse.task.mnist_pl --gpus=4

2.3. Example: Spiking convolutional classifier

Open In Colab

This classifier is taken from our tutorial on training a spiking MNIST classifier and achieves >99% accuracy.

import torch, torch.nn as nn
from norse.torch import LICell             # Leaky integrator
from norse.torch import LIFCell            # Leaky integrate-and-fire
from norse.torch import SequentialState    # Stateful sequential layers

model = SequentialState(
    nn.Conv2d(1, 20, 5, 1),      # Convolve from 1 -> 20 channels
    LIFCell(),                   # Spiking activation layer
    nn.MaxPool2d(2, 2),
    nn.Conv2d(20, 50, 5, 1),     # Convolve from 20 -> 50 channels
    LIFCell(),
    nn.MaxPool2d(2, 2),
    nn.Flatten(),                # Flatten to 800 units
    nn.Linear(800, 10),
    LICell(),                    # Non-spiking integrator layer
)

data = torch.randn(8, 1, 28, 28) # 8 batches, 1 channel, 28x28 pixels
output, state = model(data)      # Provides a tuple (tensor (8, 10), neuron state)
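
The LIFCell layers above apply leaky integrate-and-fire dynamics. As a rough, dependency-free sketch of the underlying idea, here is a single discrete Euler step of one LIF neuron; the names `tau`, `v_th`, and `v_reset` are illustrative and do not reflect Norse's actual API or default parameters:

```python
def lif_step(v, i_in, tau=0.01, v_th=1.0, v_reset=0.0, dt=0.001):
    """One Euler step of a leaky integrate-and-fire neuron.

    The membrane potential v leaks towards zero while integrating the
    input current; when it crosses the threshold, the neuron emits a
    spike and the potential is reset.
    """
    v = v + dt / tau * (i_in - v)  # leaky integration
    spike = 1.0 if v >= v_th else 0.0
    if spike:
        v = v_reset                # hard reset after a spike
    return spike, v

# Drive the neuron with a constant input current over 20 timesteps;
# with this drive it charges up, spikes, resets, and repeats.
v, spikes = 0.0, []
for _ in range(20):
    s, v = lif_step(v, i_in=2.0)
    spikes.append(s)
```

The key property for deep learning is that the output is a sparse, binary spike train rather than a dense activation, which is exactly what the spike/state tuple returned by the model above encodes.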

2.4. Example: Long short-term spiking neural networks

The long short-term spiking neural network (LSNN) from the paper by G. Bellec, D. Salaj, A. Subramoney, R. Legenstein, and W. Maass (2018) is another interesting way to apply Norse:

import torch
from norse.torch import LSNNRecurrent
# Recurrent LSNN network with 2 input neurons and 10 output neurons
layer = LSNNRecurrent(2, 10)
# Generate data: 20 timesteps with 8 datapoints per batch for 2 neurons
data  = torch.zeros(20, 8, 2)
# Tuple of (output spikes of shape (20, 8, 2), layer state)
output, new_state = layer(data)
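
What distinguishes an LSNN neuron from a plain LIF neuron is an adaptive threshold: each spike raises the effective threshold, which then decays back, giving the neuron a longer memory of its own activity. A hedged, framework-free sketch of this mechanism (parameter names are illustrative and follow the spirit of Bellec et al., not Norse's API):

```python
def lsnn_step(v, b, i_in, tau=0.01, tau_adapt=0.1,
              v_th=1.0, beta=0.5, dt=0.001):
    """One Euler step of an adaptive (LSNN-style) neuron.

    The effective threshold v_th + beta * b rises by one after every
    spike and slowly decays back, so recently active neurons become
    harder to excite -- this adaptation is the source of the longer
    time constants in LSNNs.
    """
    v = v + dt / tau * (i_in - v)      # leaky integration, as in LIF
    b = b - dt / tau_adapt * b         # adaptation variable decays
    spike = 1.0 if v >= v_th + beta * b else 0.0
    if spike:
        v = 0.0                        # reset the membrane
        b = b + 1.0                    # raise the adaptive threshold
    return spike, v, b

v = b = 0.0
spikes = []
for _ in range(40):
    s, v, b = lsnn_step(v, b, i_in=2.0)
    spikes.append(s)
```

Under the same constant drive, the rising threshold spreads successive spikes further and further apart compared to a plain LIF neuron, which is the adaptation effect the LSNN paper exploits for temporal tasks.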

3. Why Norse?

Norse was created for two reasons: 1) to apply findings from decades of research in practical settings, and 2) to accelerate our own research within bio-inspired learning.

We are passionate about Norse: we strive to follow best practices and promise to maintain this library for the simple reason that we depend on it ourselves. We have implemented a number of neuron models, synapse dynamics, encoding and decoding algorithms, dataset integrations, tasks, and examples. Combined with the PyTorch infrastructure and our high coding standards, we have found Norse to be an excellent tool for modelling scalable experiments, and Norse is actively being used in research.

Finally, we are working to keep Norse as performant as possible. Preliminary benchmarks suggest that Norse achieves excellent performance on small networks of up to ~5000 neurons per layer. Aided by the preexisting investment in scalable training and inference with PyTorch, Norse scales from a single laptop to several nodes on an HPC cluster with little effort, as illustrated by our PyTorch Lightning example task.

Read more about Norse in our documentation.

4. Similar work

The list of projects below serves to illustrate the state of the art, while explaining our own incentives to create and use Norse.

  • BindsNET also builds on PyTorch and is explicitly targeted at machine learning tasks. It implements a Network abstraction with the typical 'node' and 'connection' notions common in spiking neural network simulators like nest.
  • cuSNN is a C++ GPU-accelerated simulator for large-scale networks. The library focuses on CUDA and includes spike-time dependent plasticity (STDP) learning rules.
  • decolle implements an online learning algorithm described in the paper "Synaptic Plasticity Dynamics for Deep Continuous Local Learning (DECOLLE)" by J. Kaiser, M. Mostafa and E. Neftci.
  • GeNN compiles SNN network models to NVIDIA CUDA to achieve high-performing SNN model simulations.
  • Long short-term memory Spiking Neural Networks (LSNN) is a tool from the University of Graz for modelling LSNN cells in TensorFlow. The library focuses on a single neuron and gradient model.
  • Nengo is a neuron simulator, and Nengo-DL is a deep learning network simulator that optimises spike-based neural networks based on an approximation method suggested by Hunsberger and Eliasmith (2016).
  • Nengo PyTorch is a thin wrapper for PyTorch that adds a single voltage-only spiking model. The approach is independent of the Nengo framework.
  • Neuron Simulation Toolkit (NEST) constructs and evaluates highly detailed simulations of spiking neural networks. This is useful in a medical/biological sense but maps poorly to large datasets and deep learning.
  • PyNN is a Python interface that allows you to define and simulate spiking neural network models on different backends (both software simulators and neuromorphic hardware). It does not currently provide mechanisms for optimisation or arbitrary synaptic plasticity.
  • PySNN is a PyTorch extension similar to Norse. Its approach to model building differs slightly from Norse's in that the neurons are stateful.
  • Rockpool is a Python package developed by SynSense for training, simulating and deploying spiking neural networks. It offers both JAX and PyTorch primitives.
  • Sinabs is a PyTorch extension by SynSense. It mainly focuses on convolutions and translation to neuromorphic hardware.
  • SlayerPyTorch is a Spike LAYer Error Reassignment library that focuses on solutions to the temporal credit assignment problem of spiking neurons and a probabilistic approach to backpropagating errors. It includes support for the Loihi chip.
  • SNN toolbox automates the conversion of pre-trained analog neural networks to spiking neural networks. The tool is solely for already trained networks and omits the (possibly platform-specific) training.
  • snnTorch is a simulator built on PyTorch, featuring several introduction tutorials on deep learning with SNNs.
  • SpikingJelly is another PyTorch-based spiking neural network simulator. SpikingJelly uses stateful neurons. Example of training a network on MNIST.
  • SpyTorch presents a set of tutorials for training SNNs with the surrogate gradient approach SuperSpike by F. Zenke, and S. Ganguli (2017). Norse implements the SuperSpike surrogate gradient function, but allows for other surrogate gradients and training approaches.
  • s2net is based on the implementation presented in SpyTorch, but implements convolutional layers as well. It also contains a demonstration of how to use those primitives to train a model on the Google Speech Commands dataset.
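
Several of the libraries above, Norse included, train through the non-differentiable spike by substituting a smooth surrogate derivative in the backward pass. As a hedged, framework-free sketch of the SuperSpike idea from Zenke and Ganguli (Norse's actual implementation wraps this in PyTorch autograd functions; the function names here are illustrative):

```python
def heaviside(v):
    """Forward pass: the spike non-linearity. Its true derivative is
    zero almost everywhere, so gradients cannot flow through it."""
    return 1.0 if v >= 0.0 else 0.0

def superspike_grad(v, beta=10.0):
    """Surrogate derivative used in the backward pass instead of the
    Heaviside function's derivative:

        g(v) = 1 / (beta * |v| + 1)**2

    It peaks at the threshold (v = 0) and decays smoothly away from it,
    so gradients are largest where a small change in membrane potential
    would actually flip the spike decision.
    """
    return 1.0 / (beta * abs(v) + 1.0) ** 2
```

The forward pass still produces hard binary spikes; only the gradient is replaced, which is what makes backpropagation through spiking layers possible.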

5. Contributing

Contributions are warmly encouraged and always welcome. However, we also have high expectations for the code base, so if you wish to contribute, please refer to our contribution guidelines.

6. Credits

Norse is created by Christian Pehle and Jens Egholm Pedersen.

More information about Norse can be found in our documentation. The research has received funding from the EC Horizon 2020 Framework Programme under Grant Agreements 785907 and 945539 (HBP) and by the Deutsche Forschungsgemeinschaft (DFG, German Research Foundation) under Germany's Excellence Strategy EXC 2181/1 - 390900948 (the Heidelberg STRUCTURES Excellence Cluster).

7. Citation

If you use Norse in your work, please cite it as follows:

@software{norse2021,
  author       = {Pehle, Christian and
                  Pedersen, Jens Egholm},
  title        = {{Norse - A deep learning library for spiking
                   neural networks}},
  month        = jan,
  year         = 2021,
  note         = {Documentation: https://norse.ai/docs/},
  publisher    = {Zenodo},
  version      = {0.0.7},
  doi          = {10.5281/zenodo.4422025},
  url          = {https://doi.org/10.5281/zenodo.4422025}
}

Norse is actively applied and cited in the literature. We keep track of the papers citing Norse in our documentation.

8. License

LGPLv3. See LICENSE for license details.
