Deep learning with spiking neural networks.
Introduction
snnTorch is a Python package for performing gradient-based learning with spiking neural networks. Rather than reinventing the wheel, it sits on top of PyTorch and takes advantage of its GPU accelerated tensor computation. Pre-designed spiking neuron models are seamlessly integrated within the PyTorch framework and can be treated as recurrent activation units.
snnTorch Structure
snnTorch contains the following components:
- a spiking neuron library like torch.nn, deeply integrated with autograd
- variations of backpropagation commonly used with SNNs
- a library for spike generation and data conversion
- visualization tools for spike-based data using matplotlib and celluloid
- optional surrogate gradient functions
- dataset utility functions
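To illustrate the surrogate gradient idea mentioned above: a spiking neuron's hard threshold has a derivative of zero almost everywhere, so a smooth stand-in is substituted during the backward pass. The sketch below is illustrative only and is not snnTorch's API; the fast-sigmoid surrogate and the `slope` parameter are assumptions chosen for the example.

```python
# Illustrative sketch of the surrogate-gradient idea (not snnTorch's API).
# Forward pass: a hard threshold (Heaviside) decides whether a spike fires.
# Backward pass: the threshold's zero-almost-everywhere derivative is
# replaced by a smooth surrogate, here the derivative of a fast sigmoid.

def heaviside(x):
    """Forward spike function: fires (1.0) when the input crosses zero."""
    return 1.0 if x > 0 else 0.0

def fast_sigmoid_grad(x, slope=25.0):
    """Surrogate derivative used in place of Heaviside's true gradient."""
    return 1.0 / (slope * abs(x) + 1.0) ** 2

# The surrogate is largest at threshold and decays away from it,
# so neurons near threshold receive the strongest gradient signal.
print(fast_sigmoid_grad(0.0))   # 1.0 at threshold
print(fast_sigmoid_grad(0.2))   # much smaller away from threshold
```

This keeps the spiking (binary) forward behaviour while giving autograd a usable gradient to propagate.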
snnTorch is designed to be intuitively used with PyTorch, as though each spiking neuron were simply another activation in a sequence of layers. It is therefore agnostic to fully-connected layers, convolutional layers, residual connections, etc.
At present, the neuron models are represented by recursive functions, which removes the need to store membrane potential traces for all neurons in a system in order to calculate the gradient. The lean requirements of snnTorch enable small and large networks to be viably trained on CPU, where needed. Provided that the network models and tensors are loaded onto CUDA, snnTorch takes advantage of GPU acceleration in the same way as PyTorch.
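The recursive formulation described above can be sketched in plain Python. This is a simplified leaky integrate-and-fire update, not snnTorch's implementation; the decay factor `beta` and the reset-by-subtraction scheme are assumptions for illustration.

```python
# Simplified leaky integrate-and-fire (LIF) recursion (illustration only).
# The membrane potential at each step depends only on its previous value,
# so the full trace need not be stored to compute the next state.

def lif_step(input_current, mem, beta=0.9, threshold=1.0):
    """One recursive update: decay, integrate, spike, reset."""
    mem = beta * mem + input_current          # leaky integration
    spike = 1.0 if mem > threshold else 0.0   # fire if threshold is crossed
    mem = mem - spike * threshold             # reset by subtraction on spike
    return spike, mem

# Drive the neuron with a constant input and watch it spike periodically.
mem = 0.0
spikes = []
for _ in range(10):
    spk, mem = lif_step(0.5, mem)
    spikes.append(spk)
print(spikes)
```

Treated this way, the neuron behaves like a recurrent activation unit: each call consumes the previous membrane state and returns the next one.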
Citation
Under preparation.
Requirements
The following packages need to be installed to use snnTorch:
torch >= 1.2.0
numpy >= 1.17
pandas
matplotlib
celluloid
Installation
Run the following to install:
$ pip install snntorch
To install snnTorch from source instead:
$ git clone https://github.com/jeshraghian/snnTorch
$ cd snnTorch
$ python setup.py install
API & Examples
A complete API is available here. Examples, tutorials and Colab notebooks are provided.
Getting Started
The examples, tutorials, and Colab notebooks above are good places to get started with snnTorch.
Contributing
If you’re ready to contribute to snnTorch, instructions to do so can be found here.
Acknowledgments
snnTorch was initially developed by Jason K. Eshraghian in the Lu Group (University of Michigan).
Additional contributions were made by Xinxin Wang, Vincent Sun, and Emre Neftci.
Several features in snnTorch were inspired by the work of Friedemann Zenke, Emre Neftci, Doo Seok Jeong, Sumit Bam Shrestha and Garrick Orchard.
License & Copyright
snnTorch is licensed under the GNU General Public License v3.0: https://www.gnu.org/licenses/gpl-3.0.en.html.
History
0.1.2 (2021-02-11)
Alpha-1 release.
0.0.1 (2021-01-20)
First release on PyPI.