
Easy-to-use package for the modeling and analysis of neural network dynamics, directed towards cognitive neuroscientists.


# PsychRNN

This package is intended to help cognitive scientists easily translate task designs from human or primate behavioral experiments into a form usable as training data for a recurrent neural network.

We have isolated the front-end task design, in which users intuitively describe the conditional logic of their task, from the back end, where gradient-descent-based optimization occurs. This separation is intended to help researchers who might otherwise lack an easy implementation to design and test hypotheses about the behavior of recurrent neural networks in different task environments.

Code is written and maintained by: @davidbrandfonbrener @dbehrlic @ABAtanasov @syncrostone

## Install

### Dependencies

  • Numpy
  • Tensorflow
  • Python 2.7 or 3.6

For demos:

  • Jupyter
  • IPython
  • Matplotlib

### Installation

```shell
git clone <repository-url>   # repository URL elided in the original
cd PsychRNN
python setup.py install
```

#### Alternative Install

```shell
pip install PsychRNN
```

## 17 Lines Introduction

A minimal introduction to our package. In this simple introduction you can generate a new recurrent neural network model, train that model on the random dot motion discrimination task, and plot out an example output in just 17 lines.

```python
import psychrnn
from psychrnn.tasks import rdm as rd
from psychrnn.backend.models.basic import Basic
import tensorflow as tf

from matplotlib import pyplot as plt
%matplotlib inline

rdm = rd.RDM(dt = 10, tau = 100, T = 2000, N_batch = 128)
gen = rdm.batch_generator()

params = rdm.__dict__
params['name'] = 'model'
params['N_rec'] = 50

model = Basic(params)
model.train(gen)

x,_,_ = next(gen)
```



Code for this example can be found in “Minimal_Example.ipynb”

## Demonstration Notebook

For a more complete tour of training and model parameters see the “RDM.ipynb” notebook.

## Writing a New Task

You can easily begin running your own tasks by writing a new task subclass with the two functions (generate_trial_params, trial_function) specified below, or by modifying one of our existing task files such as “” or “”.

```python
class your_new_class(Task):

    def __init__(self, N_in, N_out, dt, tau, T, N_batch):
        super(your_new_class, self).__init__(N_in, N_out, dt, tau, T, N_batch)
```

  • N_in: number of network inputs
  • N_out: number of network outputs
  • dt: simulation time step
  • tau: unit time constant
  • T: trial length
  • N_batch: number of trials per training update


```python
def generate_trial_params(self, batch, trial):
    '''Produces the trial-specific parameters for your task
    (e.g. coherence for the random dot motion discrimination task).

    Args:
        batch: batch number for training (for internal use)
        trial: trial number within the batch (for internal use)

    Returns:
        params: a dictionary of the parameters needed by trial_function
    '''
```


def trial_function(self,t,params):

‘’‘function that specifies conditional network input, target output and loss mask for your task at a given time (e.g. if t>stim_onset x_t=1).

t: time params: params dictionary from generate_trial_params

x_t: input vector of length N_in at time t y_t: target output vector of length N_out at time t mask_t: loss function mask vector of length N_out at time t
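To see how these two functions fit together, here is a minimal, self-contained sketch of a hypothetical task subclass. The stub `Task` base class below stands in for psychrnn's, and the task itself (a noisy sign discrimination, loosely modeled on the RDM task) is invented purely for illustration:

```python
import random

class Task(object):
    """Stub standing in for psychrnn's Task base class (illustration only)."""
    def __init__(self, N_in, N_out, dt, tau, T, N_batch):
        self.N_in, self.N_out = N_in, N_out
        self.dt, self.tau, self.T, self.N_batch = dt, tau, T, N_batch

class SimpleDiscrimination(Task):
    """Hypothetical task: report the sign of a noisy scalar input."""
    def __init__(self, dt, tau, T, N_batch):
        # One input channel, two output (choice) channels.
        super(SimpleDiscrimination, self).__init__(1, 2, dt, tau, T, N_batch)

    def generate_trial_params(self, batch, trial):
        # Draw a per-trial coherence and direction, as in the RDM task.
        return {'coherence': random.choice([0.1, 0.3, 0.6]),
                'direction': random.choice([-1, 1]),
                'stim_onset': 200}

    def trial_function(self, t, params):
        # Input is zero before stimulus onset, then signed coherence plus noise.
        if t < params['stim_onset']:
            x_t = [0.0]
        else:
            x_t = [params['direction'] * params['coherence']
                   + random.gauss(0, 0.1)]
        # Target: one-hot over the two choices.
        y_t = [1.0, 0.0] if params['direction'] > 0 else [0.0, 1.0]
        # Mask out the pre-stimulus period from the loss.
        mask_t = [0.0, 0.0] if t < params['stim_onset'] else [1.0, 1.0]
        return x_t, y_t, mask_t

task = SimpleDiscrimination(dt=10, tau=100, T=2000, N_batch=128)
p = task.generate_trial_params(0, 0)
x_t, y_t, mask_t = task.trial_function(500, p)
```

Note that `trial_function` returns vectors matching `N_in` and `N_out`, as the specification above requires.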


## Building a New Model

New models can be added by extending the RNN superclass, as in our examples of “” and “”. Each new model class requires three functions: recurrent_timestep, output_timestep, and forward_pass.

```python
class your_new_model(RNN):

    def recurrent_timestep(self, rnn_in, state):
        '''Updates the recurrent state of your network by one timestep.

        Args:
            rnn_in: network input vector of length N_in at time t
            state: network state at time t

        Returns:
            new_state: network state at time t+1
        '''
```


```python
def output_timestep(self, state):
    '''Produces the output of your network for the current state at one timestep.

    Args:
        state: network state at time t

    Returns:
        output: output vector of length N_out at time t
    '''
```


```python
def forward_pass(self):
    '''Contains the loop of calls to recurrent_timestep and output_timestep
    that runs the evolution of the network state through a trial.
    '''
```
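As a toy, framework-free illustration of how forward_pass strings these calls together (the real backend builds this loop out of TensorFlow ops; `ToyRNN` and its leaky-integration update are invented here for demonstration):

```python
class ToyRNN(object):
    """Toy stand-in for the RNN superclass (illustration only)."""
    def __init__(self, N_rec, T, dt):
        self.N_rec = N_rec
        self.n_steps = T // dt
        self.leak = 0.9  # hypothetical leak/recurrent gain

    def recurrent_timestep(self, rnn_in, state):
        # Leaky integration of a scalar input into each recurrent unit.
        return [self.leak * s + (1 - self.leak) * rnn_in for s in state]

    def output_timestep(self, state):
        # Read out the mean of the recurrent state.
        return sum(state) / len(state)

    def forward_pass(self, inputs):
        # The loop the docstring above describes: update the state, then
        # read out an output, once per timestep across the trial.
        state = [0.0] * self.N_rec
        outputs = []
        for x_t in inputs:
            state = self.recurrent_timestep(x_t, state)
            outputs.append(self.output_timestep(state))
        return outputs

rnn = ToyRNN(N_rec=5, T=100, dt=10)
outs = rnn.forward_pass([1.0] * 10)  # state relaxes toward the input
```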


## Further Extensibility

If you wish to modify weight initializations, loss functions, or regularizations, it is as simple as adding a class to “” describing your preferred initial weight patterns, or a single function to “” or “”.
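A custom regularization, for instance, is ultimately just a scalar penalty computed from the network weights. A purely illustrative sketch (the function name and signature below are hypothetical, not psychrnn's actual API):

```python
def l2_regularization(weights, strength=0.01):
    """Hypothetical L2 penalty: strength times the sum of squared weights."""
    return strength * sum(w * w for row in weights for w in row)

# Example: a 2x2 weight matrix with squared sum 1.5 gives a 0.015 penalty.
W = [[0.5, -0.5], [1.0, 0.0]]
penalty = l2_regularization(W)  # 0.01 * (0.25 + 0.25 + 1.0 + 0.0) = 0.015
```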

### Backend

  • initializations
  • loss_functions
  • regularizations
  • rnn
  • simulation

## Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

| Filename | Size | File type | Python version |
| --- | --- | --- | --- |
| PsychRNN-0.3-py2.py3-none-any.whl | 16.7 kB | Wheel | py2.py3 |
| PsychRNN-0.3.tar.gz | 13.5 kB | Source | None |
