NN4N: Neural Networks for Neuroscience
This project implements some of the neural network models most commonly used in neuroscience research, to ease the implementation process.
Acknowledgements
Immense thanks to Christopher J. Cueva for his mentorship in developing this project. It could not have been completed without his invaluable help.
Change Logs
Install
Install using pip

```shell
pip install nn4n
```

Install from GitHub

```shell
# Clone the repository
git clone https://github.com/zhaozewang/NN4Neurosci.git
# Navigate to the NN4Neurosci directory
cd NN4Neurosci/
# Install in editable mode
pip install -e .
```
Model
CTRNN
An implementation of the standard continuous-time RNN (CTRNN). It supports enforcing sparsity constraints (i.e., preventing new synapses from being created) and E/I constraints (i.e., enforcing Dale's law), ensuring that gradient descent updates synapses under biologically plausible constraints.
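As a rough sketch of how such mask-based constraints can be enforced in principle (plain Python for illustration; the function and argument names here are hypothetical, not the nn4n API), a sparsity mask and a per-neuron sign vector can be used to project the recurrent weights back onto the constraint set after each gradient step:

```python
def apply_constraints(w, sparsity_mask, sign_mask):
    """Project a weight matrix onto the constraint set:
    zero out forbidden synapses (sparsity) and force the sign of each
    presynaptic neuron's outgoing weights (Dale's law).
    Illustrative only -- not the nn4n API."""
    n = len(w)
    out = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            if sparsity_mask[i][j] == 0:
                continue                      # synapse not allowed to exist
            v = w[i][j]
            if sign_mask[j] > 0:              # presynaptic neuron j is excitatory
                out[i][j] = max(v, 0.0)
            else:                             # presynaptic neuron j is inhibitory
                out[i][j] = min(v, 0.0)
    return out

w = [[0.5, -0.2], [-0.3, 0.4]]
sparsity = [[1, 1], [0, 1]]                   # synapse (1, 0) is forbidden
signs = [1, -1]                               # neuron 0 excitatory, neuron 1 inhibitory
print(apply_constraints(w, sparsity, signs))  # [[0.5, -0.2], [0.0, 0.0]]
```

Applying such a projection after every optimizer step keeps the learned weights within the biologically motivated constraint set throughout training.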
Structure
In the CTRNN implementation, the hidden layer structure can be easily controlled by specifying sparsity masks and E/I masks. We put all RNN update logic in the model module and all structure-related logic in the structure module to streamline the implementation process.
We also place extra emphasis on structure, as it is often more informative about the underlying biological mechanisms. For instance, we might require different module sizes, or a multi-module network with E/I constraints; implementing these by hand can be verbose and error-prone. The structure module lets you achieve these goals by specifying only a few parameters.
Multi-Area
This implementation groups neurons into areas according to your specification. Areas can have different sizes and different connectivity patterns to other areas.
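One way to picture a multi-area hidden layer is as a block-structured sparsity mask over the recurrent weights. The helper below is a hypothetical illustration (not the nn4n API), assuming an area-by-area connectivity indicator:

```python
def multi_area_mask(sizes, inter):
    """Build an n-by-n block sparsity mask from area sizes.
    inter[a][b] == 1 allows connections from area b into area a.
    Illustrative helper, not the nn4n API."""
    bounds, start = [], 0
    for s in sizes:
        bounds.append((start, start + s))     # index range of each area
        start += s
    n = start
    mask = [[0] * n for _ in range(n)]
    for a, (r0, r1) in enumerate(bounds):
        for b, (c0, c1) in enumerate(bounds):
            if inter[a][b]:
                for i in range(r0, r1):
                    for j in range(c0, c1):
                        mask[i][j] = 1        # whole block allowed
    return mask

# Area 0 has 1 neuron, area 1 has 2; area 1 receives from area 0 but not vice versa.
print(multi_area_mask([1, 2], [[1, 0], [1, 1]]))
# [[1, 0, 0], [1, 1, 1], [1, 1, 1]]
```

Element-wise multiplying the recurrent weight matrix by such a mask restricts learning to the allowed inter-area connections.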
Multi-Area with E/I constraints
On top of modeling the brain with a multi-area hidden layer, another critical constraint is Dale's law, as proposed in the paper Training Excitatory-Inhibitory Recurrent Neural Networks for Cognitive Tasks: A Simple and Flexible Framework by Song et al., 2016. The MultiAreaEI class in the structure module is designed to implement such a connectivity matrix.
Additionally, how different E/I regions connect to one another can be tricky; we parameterize this process so that it can be controlled with only two lines of code.
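To make the E/I assignment concrete, here is a minimal sketch of building a per-area sign vector, again with illustrative names rather than the library's own:

```python
def ei_signs(sizes, n_exc):
    """Per-area E/I sign vector: the first n_exc[a] neurons of area a are
    excitatory (+1), the rest inhibitory (-1). Illustrative, not the nn4n API."""
    signs = []
    for size, ne in zip(sizes, n_exc):
        signs += [1] * ne + [-1] * (size - ne)
    return signs

# Two areas of sizes 3 and 2, with 2 and 1 excitatory neurons respectively.
print(ei_signs([3, 2], [2, 1]))  # [1, 1, -1, 1, -1]
```

Combined with a block sparsity mask, this sign vector is enough to enforce Dale's law area by area.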
Random Input
A neuron's dynamics are heavily driven by its input signal. Injecting signals into only a subset of neurons results in more versatile and hierarchical dynamics. See A Versatile Hub Model For Efficient Information Propagation And Feature Selection. This is supported by the RandomInput class.
- Example to be added
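Until an official example lands, here is an illustrative sketch (hypothetical names, not the RandomInput API) of an input mask that projects each input channel onto a random subset of neurons:

```python
import random

def random_input_mask(n_neurons, n_input, frac, seed=0):
    """Binary mask of shape (n_neurons, n_input): each entry is 1 with
    probability frac, so each input channel reaches roughly a frac fraction
    of the neurons. Illustrative, not the nn4n API."""
    rng = random.Random(seed)                 # seeded for reproducibility
    return [[1 if rng.random() < frac else 0 for _ in range(n_input)]
            for _ in range(n_neurons)]

mask = random_input_mask(n_neurons=5, n_input=3, frac=0.5, seed=1)
```

Multiplying the input weight matrix element-wise by such a mask confines the external signal to a subset of the network, leaving the remaining neurons driven only recurrently.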
Criterion
RNNLoss
The loss function is modularized: the RNNLoss class includes the most commonly used loss functions, such as a Frobenius norm on connectivity, metabolic cost, and reconstruction loss.
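To illustrate the idea of a modular, composable loss (a hand-rolled sketch with hypothetical names, not the RNNLoss API), the individual regularization terms can be computed separately and combined with weights:

```python
def frobenius_sq(w):
    """Squared Frobenius norm of a weight matrix: penalizes large connectivity."""
    return sum(v * v for row in w for v in row)

def metabolic_cost(rates):
    """Mean squared firing rate over time steps and neurons:
    penalizes high activity (a simple metabolic proxy)."""
    return sum(r * r for step in rates for r in step) / (len(rates) * len(rates[0]))

def total_loss(task_loss, w, rates, lambda_w=1e-3, lambda_r=1e-2):
    """Weighted sum of the task loss and the regularization terms."""
    return task_loss + lambda_w * frobenius_sq(w) + lambda_r * metabolic_cost(rates)

print(frobenius_sq([[1, 2], [3, 4]]))              # 30
print(metabolic_cost([[1.0, 1.0], [1.0, 1.0]]))    # 1.0
```

Keeping each term as its own function makes it easy to toggle regularizers on or off and to tune their weights independently.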
Others
For similar projects:
Source Distribution
File details
Details for the file nn4n-1.0.3.tar.gz.
File metadata
- Download URL: nn4n-1.0.3.tar.gz
- Upload date:
- Size: 19.5 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.2 CPython/3.10.13
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | ed6c4a53e8b4327dfe0171a4343f51ba6a5bec76bd6cd00e293f49d525b405d1 |
| MD5 | a3caec5815ff5de1d7cea8c659f272db |
| BLAKE2b-256 | 837f29d8242f32bc5321310756f9cc766e25fc97233a6d916b2f902e370e1a0c |