Self-Organizing Recurrent Neural Networks
Project description
SORN is a class of neuro-inspired artificial neural networks built on plasticity mechanisms found in the biological brain, mimicking the neocortical circuits' ability to learn and adapt. SORN consists of a pool of excitatory neurons and a small population of inhibitory neurons, which are controlled by five plasticity mechanisms found in the neocortex, namely Spike Timing Dependent Plasticity (STDP), Intrinsic Plasticity (IP), Synaptic Scaling (SS), Synaptic Normalization (SN) and inhibitory Spike Timing Dependent Plasticity (iSTDP). Using mathematical tools, the SORN network simplifies the underlying structural and functional connectivity mechanisms responsible for learning and memory in the brain.
'sorn' is a Python package designed for Self-Organizing Recurrent Neural Networks. It provides a research environment for computational neuroscientists to study the self-organization, adaptation, learning, memory and behavior of brain circuits by reverse engineering neural plasticity mechanisms. To further extend the potential applications of sorn, a demonstrative example of a neuro-robotics experiment using OpenAI Gym is also documented.
SORN Reservoir
Installation
pip install sorn
or to install the latest version from the development branch
pip install git+https://github.com/Saran-nns/sorn
Dependencies
SORN supports Python 3.7+ ONLY. For older Python versions, please use the official Python client.
To install all optional dependencies,
pip install 'sorn[all]'
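To check that the package is importable after installation, a quick sanity check like the one below can be used; it assumes only that sorn exposes the conventional __version__ attribute, which may not hold for every release.
import sorn
# Print the installed version if the package exposes the conventional
# __version__ attribute; otherwise fall back to a placeholder message.
print(getattr(sorn, "__version__", "sorn imported (no __version__ attribute)"))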
Usage
Plasticity Phase
import sorn
from sorn import Simulator
import numpy as np
# Sample input
num_features = 10
time_steps = 200
inputs = np.random.rand(num_features,time_steps)
# Simulate the network with default hyperparameters under Gaussian white noise
state_dict, sim_dict = Simulator.run(inputs=inputs, phase='plasticity',
state=None, noise=True,
timesteps=time_steps,
callbacks=["ExcitatoryActivation",
"WEE",
"EEConnectionCounts"])
Network Initialized
Number of connections in Wee 3909 , Wei 1574, Wie 8000
Shapes Wee (200, 200) Wei (40, 200) Wie (200, 40)
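The two returned dictionaries can be inspected right away. The short sketch below assumes that sim_dict is keyed by the callback names passed to Simulator.run above; the exact array shapes depend on the network size and simulation length, so treat it as an illustration rather than part of the documented API.
# Inspect the quantities collected by the requested callbacks.
# Assumes sim_dict keys match the callback names passed to Simulator.run.
for name in ["ExcitatoryActivation", "WEE", "EEConnectionCounts"]:
    value = sim_dict.get(name)
    if value is not None:
        print(name, np.asarray(value).shape)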
Training Phase
from sorn import Trainer
# NOTE: During the training phase, the input to `sorn` should have its second (time) dimension set to 1, i.e., the input shape should be (num_features, 1).
inputs = np.random.rand(num_features,1)
# SORN network is frozen during training phase
state_dict, sim_dict = Trainer.run(inputs= inputs, phase='training',
state=state_dict, noise=False,
timesteps=1,
ne=100, nu=num_features,
lambda_ee=10, eta_stdp=0.001,
callbacks=["InhibitoryActivation",
"WEI",
"EIConnectionCounts"] )
Network Output Descriptions
state_dict - Dictionary of connection weights (Wee, Wei, Wie), excitatory network activity (X), inhibitory network activity (Y) and threshold values (Te, Ti).
sim_dict - Dictionary of network states and parameters collected during the simulation/training. If all available options of the callbacks argument are passed, sim_dict contains the following:
"ExcitatoryActivation" - Excitatory network activity of entire simulation period
"InhibitoryActivation" - Inhibitory network activity of entire simulation period
"RecurrentActivation" - Recurrent network activity of entire simulation period
"EEConnectionCounts" - Number of active connections in the Excitatory pool at each time step
"EIConnectionCounts" - Number of active connections from Inhibitory to Excitatory pool at each time step
"TE" - Threshold values of excitatory neurons at each time step
"TI" - Threshold values of inhibitory neurons at each time step
"WEE" - Synaptic efficacies between excitatory neurons
"WEI" - Connection weights from inhibitory to excitatory neurons
"WIE" - Connection weights from excitatory to inhibitory neurons
Documentation
For detailed documentation about development, analysis, plotting methods and a sample experiment with OpenAI Gym, please visit the SORN Documentation.
Citation
@article{Nambusubramaniyan2021,
doi = {10.21105/joss.03545},
url = {https://doi.org/10.21105/joss.03545},
year = {2021},
publisher = {The Open Journal},
volume = {6},
number = {65},
pages = {3545},
author = {Saranraj Nambusubramaniyan},
title = {`sorn`: A Python package for Self Organizing Recurrent Neural Network},
journal = {Journal of Open Source Software}
}
Contributions
Contributions are welcome. If you wish to contribute, please create a branch and open a pull request so the changes can be discussed there. If you find a bug in the code or an error in the documentation, please open a new issue in the GitHub repository and report it, providing sufficient information for the bug to be reproduced.
Download files
Source Distribution
Built Distribution
File details
Details for the file sorn-0.7.4.tar.gz.
File metadata
- Download URL: sorn-0.7.4.tar.gz
- Upload date:
- Size: 24.2 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.0 CPython/3.8.0
File hashes
Algorithm | Hash digest
---|---
SHA256 | bab6253253873fd9ef66fcf4a1924d4b613cb73af6c7d79777c6d949479b0c97
MD5 | 55d4359b4c1e2046453cd17663595f39
BLAKE2b-256 | 826f9e91eac1ad0241f45023034aae79684b965de0e172fe2629a18be96f834c
File details
Details for the file sorn-0.7.4-py3-none-any.whl.
File metadata
- Download URL: sorn-0.7.4-py3-none-any.whl
- Upload date:
- Size: 24.2 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.0 CPython/3.8.0
File hashes
Algorithm | Hash digest
---|---
SHA256 | 682d83eeb3250b5c094a48a3346035e201fe6cc2dc83ae8788237c1cc5a50635
MD5 | 43201fe8380159ffb29ab7cfe643e4dc
BLAKE2b-256 | 7ea7af6372f8fa37423b7f2e177aaf5fcfce592f9859d77aae344bff81ed7d9f