
NN4N: Neural Networks for Neuroscience

This project includes some of the most commonly used neural network models in neuroscience research, to ease the implementation process.

GitHub: https://github.com/zhaozewang/NN4Neurosci


Acknowledgements

Immense thanks to Christopher J. Cueva for his mentorship in developing this project. It could not have been done without his invaluable help.


Install

Install using pip

```bash
pip install nn4n
```

Install from GitHub

Clone the repository:

```bash
git clone https://github.com/zhaozewang/NN4Neurosci.git
```

Navigate to the NN4Neurosci directory and install in editable mode:

```bash
cd NN4Neurosci/
pip install -e .
```

Model

CTRNN

An implementation of the standard continuous-time RNN (CTRNN). This implementation supports sparsity constraints (i.e., preventing new synapses from being created) and E/I constraints (i.e., enforcing Dale's law), ensuring that gradient descent updates synapses under biologically plausible constraints.
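For reference, a CTRNN's hidden state typically follows the leaky-integration dynamics τ dx/dt = −x + W_rec f(x) + W_in u + b. The sketch below shows one Euler step with the two constraint masks applied; it is a minimal illustration of the idea, not the actual nn4n API (the function name, arguments, and tanh nonlinearity are all assumptions).

```python
# Minimal sketch of one CTRNN Euler step with sparsity and Dale's-law
# masks applied; illustrative only, not the actual nn4n implementation.
import torch

def ctrnn_step(x, u, w_rec, w_in, b, sparsity_mask, sign_mask, alpha=0.1):
    """One Euler step of tau dx/dt = -x + W_rec f(x) + W_in u + b.

    alpha = dt / tau. sparsity_mask zeroes forbidden synapses; sign_mask
    (+1 for excitatory columns, -1 for inhibitory) enforces Dale's law.
    """
    w_eff = torch.abs(w_rec) * sign_mask * sparsity_mask  # constrained weights
    r = torch.tanh(x)                                     # firing rates
    return x + alpha * (-x + r @ w_eff.T + u @ w_in.T + b)
```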

Structure

In the CTRNN implementation, the hidden layer structure can be controlled simply by specifying sparsity masks and E/I masks. All RNN update logic lives in the model module and all structure-related logic in the structure module, which streamlines the implementation process.
We also place particular emphasis on structure, as it is often more informative about the underlying biological mechanisms. For instance, we might require different module sizes, or a multi-module network with E/I constraints; implementing this by hand can be verbose and error-prone. The implementation below lets you achieve these goals by specifying just a few parameters, as sketched after this paragraph.
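As a toy illustration of what these masks are (plain NumPy, not the nn4n interface), a sparsity mask and an E/I sign mask are simply matrices with the same shape as the recurrent weights; the layer size, sparsity level, and 80/20 E/I split below are arbitrary assumptions.

```python
# Toy construction of a sparsity mask and an E/I sign mask for a
# 100-neuron hidden layer; illustrative, not the nn4n interface.
import numpy as np

n = 100
rng = np.random.default_rng(seed=0)

# Sparsity mask: 1 where a synapse may exist, 0 where it is forbidden.
sparsity_mask = (rng.random((n, n)) < 0.2).astype(float)

# E/I sign mask: the first 80 neurons are excitatory (+1 outgoing
# weights), the last 20 inhibitory (-1); columns index presynaptic cells.
sign_mask = np.ones((n, n))
sign_mask[:, 80:] = -1
```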

Multi-Area

The following implementation groups neurons into areas according to your specification. Areas can have different sizes and different connectivity to other areas, as sketched below.
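A rough sketch of the underlying idea (again plain NumPy rather than the nn4n classes): a block-structured mask with dense within-area connectivity and sparse projections between two hypothetical areas. The sizes and sparsity level are assumptions.

```python
# Sketch: block-structured connectivity mask for two areas of different
# sizes, with sparse projections from area 1 to area 2 only.
import numpy as np

sizes = [60, 40]                     # hypothetical area sizes
n = sum(sizes)
rng = np.random.default_rng(seed=0)

area_mask = np.zeros((n, n))
area_mask[:60, :60] = 1              # dense within area 1
area_mask[60:, 60:] = 1              # dense within area 2
area_mask[60:, :60] = (rng.random((40, 60)) < 0.1).astype(float)  # 1 -> 2
```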

Multi-Area with E/I constraints

On top of modeling the brain with a multi-area hidden layer, another critical constraint is Dale's law, as proposed in the paper Training Excitatory-Inhibitory Recurrent Neural Networks for Cognitive Tasks: A Simple and Flexible Framework by Song et al., 2016. The MultiAreaEI class in the structure module implements such a connectivity matrix.
Additionally, how different E/I regions connect to one another can be tricky; we parameterized this process so that it can be controlled with only two lines of code, as sketched below.
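Conceptually, this amounts to combining the multi-area mask with a Dale's-law sign mask, so that each neuron's outgoing weights are either all excitatory or all inhibitory. The snippet below continues the toy NumPy example above; it is not the MultiAreaEI API, and the 80/20 E/I split is an assumption.

```python
# Combine a multi-area mask with a Dale's-law sign mask; each column
# (presynaptic neuron) is purely excitatory (+1) or inhibitory (-1).
import numpy as np

n = 100
area_mask = np.ones((n, n))                 # stand-in for the block mask above
signs = np.where(np.arange(n) < 80, 1, -1)  # 80 E cells, 20 I cells (assumed)
ei_mask = area_mask * signs[None, :]
```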

Random Input

Neurons' dynamics are heavily driven by the input signal they receive. Injecting signals into only a subset of neurons results in more versatile and hierarchical dynamics. See A Versatile Hub Model For Efficient Information Propagation And Feature Selection. This is supported by the RandomInput class.

  • Example to be added; a rough sketch of the idea follows.
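In the meantime, here is a hedged sketch of the idea behind RandomInput: an input mask that routes the input signal to a random subset of hidden neurons. The sizes and mask construction are illustrative assumptions, not the nn4n API.

```python
# Sketch: inject input into only a random subset of hidden neurons.
import numpy as np

n_hidden, n_input = 100, 10
rng = np.random.default_rng(seed=0)

input_mask = np.zeros((n_hidden, n_input))
receivers = rng.choice(n_hidden, size=20, replace=False)
input_mask[receivers, :] = 1                       # only 20 neurons get input
w_in = input_mask * rng.standard_normal((n_hidden, n_input))
```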

Criterion

RNNLoss

The RNNLoss class is designed in a modular fashion and includes the most commonly used loss functions, such as a Frobenius norm on connectivity, metabolic cost, and reconstruction loss.
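The sketch below shows what such a modular loss looks like conceptually: a task (reconstruction) term plus weighted regularizers. The term weights, names, and signature are illustrative assumptions, not the RNNLoss API.

```python
# Conceptual sketch of a modular RNN loss: task error plus a Frobenius-
# norm penalty on recurrent connectivity and a metabolic (rate) cost.
import torch

def rnn_loss(outputs, targets, w_rec, rates,
             lambda_conn=1e-4, lambda_metab=1e-3):
    task = torch.mean((outputs - targets) ** 2)      # reconstruction loss
    conn = lambda_conn * torch.sum(w_rec ** 2)       # squared Frobenius norm
    metab = lambda_metab * torch.mean(rates ** 2)    # metabolic cost
    return task + conn + metab
```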

