
EvNN: a Torch extension for custom event-based RNN models.

Project description

EvNN: Event-based Neural Networks

EvNN is a CUDA and C++ implementation of event-based RNN layers with built-in DropConnect and Zoneout regularization. These layers are exposed through C++ and PyTorch APIs for easy integration into your own projects or machine learning frameworks. The code framework and base layers are adapted from the Haste library.

EGRU: Event-based Gated Recurrent Unit

The event-based GRU was published as a conference paper at ICLR 2023: Efficient recurrent architectures through activity sparsity and sparse back-propagation through time (notable-top-25%).

EGRU illustration

Illustration of EGRU. A: A single unit of the original GRU model, adapted from Cho et al. B: EGRU unit with event-generating mechanism. C: Heaviside function and surrogate gradient. D: Forward state dynamics for two EGRU units ($i$ and $j$). E: Activity-sparse backward dynamics for two EGRU units ($i$ and $j$). (Note that we only have to backpropagate through units that were active or whose state was close to the threshold at each time step.)
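The event-generating mechanism in panel B can be sketched in a few lines. This is a toy illustration, not the library's implementation: the state update below stands in for the actual GRU dynamics, the threshold value is assumed, and the reset rule is simplified to zeroing the state at an event. During training, the hard Heaviside step is replaced by a surrogate gradient, as in panel C.

```python
import numpy as np

def heaviside(x):
    """Hard step function used in the forward pass; training uses a surrogate gradient."""
    return np.float64(x > 0.0)

theta = 1.0   # firing threshold (assumed value, for illustration)
c = 0.0       # internal state of one unit
outputs = []
for drive in [0.6, 0.6, 0.2]:     # toy per-step input; stands in for the GRU update
    c = c + drive                 # simplified state dynamics
    event = heaviside(c - theta)  # an event fires only when c crosses theta
    outputs.append(c * event)     # output is zero between events (activity sparsity)
    c = c * (1.0 - event)         # simplified reset after an event
```

The unit stays silent until its state crosses the threshold, emits its state as an event, and is reset, which is exactly what makes both the forward pass and backpropagation sparse.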

EvNN Animation

Which RNN types are currently supported?

  • EGRU (Event-based Gated Recurrent Unit)

What's included in this project?

  • a PyTorch API (evnn_pytorch) for event based neural networks

Install

Here's what you'll need to get started:

  • a C++ compiler and GNU Make (to build from source)
  • the CUDA Toolkit
  • PyTorch

Once you have the prerequisites, you can install with pip or by building the source code.

Building from source

Note

Currently supported only on Linux; use Docker for building on Windows.

make evnn_pytorch # Build PyTorch API

If you built the PyTorch API, install it with pip:

pip install evnn_pytorch-*.whl

If the CUDA Toolkit that you're building against is not in /usr/local/cuda, you must specify the $CUDA_HOME environment variable before running make:

CUDA_HOME=/usr/local/cuda-10.2 make

Performance

Code for the experiments and benchmarks presented in the paper is published in the benchmarks directory. Note that these benchmarks have additional dependencies, as documented in benchmarks/requirements.txt.

Documentation

PyTorch API

import torch
import evnn_pytorch as evnn

input_size, hidden_size = 128, 16  # example dimensions

# setting use_custom_cuda=False makes the model use PyTorch code
# instead of the EvNN extension
egru_layer = evnn.EGRU(input_size, hidden_size, zoneout=0.0, batch_first=True,
                       use_custom_cuda=True)

egru_layer.cuda()

# `x` is a CUDA tensor with shape [N, T, C]
x = torch.rand([5, 25, input_size]).cuda()

y, state = egru_layer(x)
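Because an EGRU output is zero everywhere except at events, activity sparsity can be quantified directly as the fraction of zero entries. The snippet below uses a NumPy stand-in tensor rather than an actual EGRU output, so the numbers are illustrative only:

```python
import numpy as np

# stand-in for an EGRU output with shape [N, T, C]: mostly zeros, two "events"
y = np.zeros((5, 25, 16))
y[0, 3, 2] = 0.7
y[1, 10, 5] = 1.1

# fraction of entries that carry no event
sparsity = 1.0 - np.count_nonzero(y) / y.size
```

The same one-liner works on the real `y` returned by the layer after moving it to the CPU.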

The PyTorch API is documented in docs/pytorch/evnn_pytorch.md.

Code layout

Testing

Run the tests with Python's unittest (NumPy is required):

python -m unittest discover -p '*_test.py' -s validation

Note

Tests will fail if you set the dimensions (batch_size, time_steps, input_size, hidden_size) too high. This is because floating-point errors can accumulate and cause units to generate events one time step off. That makes the numerical tests fail, but neural network training will still work without any issues.
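The effect is easy to reproduce in isolation: accumulating the same drive at different precisions can leave the state fractionally above or below the threshold, flipping the event decision at that time step. The numbers below are hypothetical and chosen only to show the mechanism:

```python
import numpy as np

theta = 1.0   # firing threshold (illustrative)
steps = 10
drive = 0.1   # per-step input drive (illustrative)

# accumulate the state in float32, as a long unrolled RNN might
c32 = np.float32(0.0)
for _ in range(steps):
    c32 = c32 + np.float32(drive)

# the same accumulation in float64
c64 = 0.0
for _ in range(steps):
    c64 = c64 + drive

# the two precisions land on opposite sides of the threshold,
# so one would emit an event at this step and the other would not
print(c32 >= theta, c64 >= theta)
```

This is why the numerical tests compare event timings exactly and break down for large tensors, while training (which never compares against a reference trajectory) is unaffected.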

Implementation notes

  • the EGRU is based on the Haste GRU, which is in turn based on arXiv:1406.1078v1 (the same variant as cuDNN) rather than 1406.1078v3
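The difference between the two arXiv revisions is where the reset gate acts: in v1 (and cuDNN, and hence Haste and EGRU) the reset gate multiplies the recurrent matrix product, while in v3 it multiplies the hidden state before the product. A minimal NumPy sketch of a v1-style cell, with randomly initialized weights standing in for trained parameters (biases omitted for brevity):

```python
import numpy as np

rng = np.random.default_rng(0)
input_size, hidden_size = 4, 3

# hypothetical parameters, randomly initialized for illustration
Wz, Wr, Wh = (rng.normal(size=(hidden_size, input_size)) for _ in range(3))
Uz, Ur, Uh = (rng.normal(size=(hidden_size, hidden_size)) for _ in range(3))

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_cell_v1(x, h):
    z = sigmoid(Wz @ x + Uz @ h)              # update gate
    r = sigmoid(Wr @ x + Ur @ h)              # reset gate
    h_cand = np.tanh(Wh @ x + r * (Uh @ h))   # v1/cuDNN: gate applied AFTER Uh @ h
    # (v3 would instead compute np.tanh(Wh @ x + Uh @ (r * h)))
    return z * h + (1.0 - z) * h_cand

x = rng.normal(size=input_size)
h = gru_cell_v1(x, np.zeros(hidden_size))
```

The v1 form lets the three recurrent matrix products be fused into one GEMM before any gating, which is why cuDNN-style implementations prefer it.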

References

  1. Nanavati, Sharvil, ‘Haste: A Fast, Simple, and Open RNN Library’, 2020 https://github.com/lmnt-com/haste/

  2. K. Cho, B. van Merriënboer, C. Gulcehre, D. Bahdanau, F. Bougares, H. Schwenk, and Y. Bengio. Learning phrase representations using RNN encoder–decoder for statistical machine translation. In Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing (EMNLP), pages 1724–1734, Doha, Qatar, Oct. 2014. Association for Computational Linguistics. doi: 10.3115/v1/D14-1179.

Citing this work

To cite this work, please use the following BibTeX entry:

@inproceedings{
evnn2023,
title={Efficient recurrent architectures through activity sparsity and sparse back-propagation through time},
author={Anand Subramoney and Khaleelulla Khan Nazeer and Mark Sch{\"o}ne and Christian Mayr and David Kappel},
booktitle={The Eleventh International Conference on Learning Representations},
year={2023},
url={https://openreview.net/forum?id=lJdOlWg8td},
howpublished={https://github.com/KhaleelKhan/EvNN/}
}

License

Apache 2.0

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

evnn_pytorch-0.1.0.tar.gz (27.3 kB)

Uploaded Source

File details

Details for the file evnn_pytorch-0.1.0.tar.gz.

File metadata

  • Download URL: evnn_pytorch-0.1.0.tar.gz
  • Upload date:
  • Size: 27.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.9.16

File hashes

Hashes for evnn_pytorch-0.1.0.tar.gz
  • SHA256: f1d79b714d9b5ea33040f56a62c9686d4b0dc7c0bc5ab1f3a2713c432d5808dd
  • MD5: 10136932786f51dcac1882dd61b972f0
  • BLAKE2b-256: f0d863be06a6d4010245c486639d9121ff49abba699109733c1c71374e6da345

