
FangNao: A Just-In-Time compilation approach for neuronal dynamics simulation.

Project description

Badges: Documentation Status | Anaconda version (https://anaconda.org/oujago/npbrain/badges/version.svg) | PyPI version (https://badge.fury.io/py/npbrain.svg)

Note: BrainPy is a project under development. More features are coming soon. Contributions are welcome.

Why use BrainPy

BrainPy is a microkernel framework for SNN (spiking neural network) simulation, written purely in native Python; it relies only on NumPy. However, if you want faster performance, you can additionally install Numba. With Numba, simulations can run at the speed of C or FORTRAN.

BrainPy aims to provide a highly flexible and efficient SNN simulation framework for Python users. It gives users full control of the data/logic flow. The core of the framework is a micro-kernel that is easy to understand (see How NumpyBrain works). On top of this kernel, extending the framework with new models or customizing the data/logic flow is straightforward. Ample examples (such as the LIF neuron, HH neuron, AMPA synapse, GABA synapse and gap junction) are also provided. Beyond flexibility, Numba is used to accelerate the NumPy code; in most cases, models running on the Numba backend are very fast (see examples/benchmark).

Speed comparison with brian2

For more details about BrainPy, please see our documentation.

Installation

Install BrainPy using pip:

$> pip install git+https://github.com/PKU-NIP-Lab/BrainPy

Install from source code:

$> python setup.py install

The following packages need to be installed to use BrainPy:

  • Python >= 3.5

  • NumPy >= 1.13

  • Sympy >= 1.2

  • Matplotlib >= 2.0

  • autopep8

Packages recommended to install (an example pip command is given after the list):

  • Numba >= 0.40.0

  • JAX >= 0.1.0
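
Both recommended packages can be installed with pip; the version specifiers below are simply the minima listed above, and JAX may additionally require a matching jaxlib wheel on some platforms:

$> pip install "numba>=0.40.0" "jax>=0.1.0"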

Define a Hodgkin–Huxley neuron model

import npbrain.numpy as np
import npbrain as nb

def HH(noise=0., E_Na=50., g_Na=120., E_K=-77., g_K=36.,
       E_Leak=-54.387, g_Leak=0.03, C=1.0, Vth=20.):

    ST = nb.types.NeuState(
        {'V': -65., 'm': 0., 'h': 0., 'n': 0., 'sp': 0., 'inp': 0.},
        help='Hodgkin–Huxley neuron state.\n'
             '"V" denotes membrane potential.\n'
             '"n" denotes potassium channel activation probability.\n'
             '"m" denotes sodium channel activation probability.\n'
             '"h" denotes sodium channel inactivation probability.\n'
             '"sp" denotes spiking state.\n'
             '"inp" denotes synaptic input.\n'
    )

    @nb.integrate
    def int_m(m, t, V):
        alpha = 0.1 * (V + 40) / (1 - np.exp(-(V + 40) / 10))
        beta = 4.0 * np.exp(-(V + 65) / 18)
        return alpha * (1 - m) - beta * m

    @nb.integrate
    def int_h(h, t, V):
        alpha = 0.07 * np.exp(-(V + 65) / 20.)
        beta = 1 / (1 + np.exp(-(V + 35) / 10))
        return alpha * (1 - h) - beta * h

    @nb.integrate
    def int_n(n, t, V):
        alpha = 0.01 * (V + 55) / (1 - np.exp(-(V + 55) / 10))
        beta = 0.125 * np.exp(-(V + 65) / 80)
        return alpha * (1 - n) - beta * n

    @nb.integrate(noise=noise / C)
    def int_V(V, t, m, h, n, Isyn):
        INa = g_Na * m ** 3 * h * (V - E_Na)
        IK = g_K * n ** 4 * (V - E_K)
        IL = g_Leak * (V - E_Leak)
        dvdt = (- INa - IK - IL + Isyn) / C
        return dvdt

    def update(ST, _t_):
        # integrate the gating variables (clipped to [0, 1]) and the membrane potential
        m = np.clip(int_m(ST['m'], _t_, ST['V']), 0., 1.)
        h = np.clip(int_h(ST['h'], _t_, ST['V']), 0., 1.)
        n = np.clip(int_n(ST['n'], _t_, ST['V']), 0., 1.)
        V = int_V(ST['V'], _t_, m, h, n, ST['inp'])
        # a spike is an upward crossing of the threshold Vth
        sp = np.logical_and(ST['V'] < Vth, V >= Vth)
        ST['sp'] = sp
        ST['V'] = V
        ST['m'] = m
        ST['h'] = h
        ST['n'] = n
        # clear the synaptic input for the next time step
        ST['inp'] = 0.

    return nb.NeuType(requires={"ST": ST}, steps=update, vector_based=True)
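
The definition above only declares the model; how it is grouped and run depends on the rest of the framework. As a framework-independent illustration, the same Hodgkin–Huxley equations can be stepped with a plain-NumPy Euler loop. This is only a sketch: the constant 10 µA/cm² input, the 0.01 ms time step and the 200 ms duration are arbitrary choices made here, not values taken from the project.

import numpy as np

# Plain-NumPy Euler integration of the same HH equations,
# starting from the same initial state as the ST dict above.
E_Na, g_Na, E_K, g_K = 50., 120., -77., 36.
E_Leak, g_Leak, C, Vth = -54.387, 0.03, 1.0, 20.
dt, num_steps, I_ext = 0.01, 20000, 10.    # ms, steps, constant input (assumed value)
V, m, h, n = -65., 0., 0., 0.
spike_times = []

for i in range(num_steps):
    a_m = 0.1 * (V + 40) / (1 - np.exp(-(V + 40) / 10))
    b_m = 4.0 * np.exp(-(V + 65) / 18)
    a_h = 0.07 * np.exp(-(V + 65) / 20.)
    b_h = 1 / (1 + np.exp(-(V + 35) / 10))
    a_n = 0.01 * (V + 55) / (1 - np.exp(-(V + 55) / 10))
    b_n = 0.125 * np.exp(-(V + 65) / 80)
    m += dt * (a_m * (1 - m) - b_m * m)
    h += dt * (a_h * (1 - h) - b_h * h)
    n += dt * (a_n * (1 - n) - b_n * n)
    INa = g_Na * m ** 3 * h * (V - E_Na)
    IK = g_K * n ** 4 * (V - E_K)
    IL = g_Leak * (V - E_Leak)
    V_new = V + dt * (- INa - IK - IL + I_ext) / C
    if V < Vth <= V_new:                   # spike = upward crossing of the threshold
        spike_times.append(i * dt)
    V = V_new

print(f"{len(spike_times)} spikes in {num_steps * dt:.0f} ms")

With a supra-threshold constant input like this, the loop should report a regular spike train, the textbook response of the Hodgkin–Huxley model.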

Define an AMPA synapse model

def AMPA(g_max=0.10, E=0., tau_decay=2.0):

    requires = dict(
        ST=nb.types.SynState(['s'], help='AMPA synapse state.'),
        pre=nb.types.NeuState(['sp'], help='Pre-synaptic state must have "sp" item.'),
        post=nb.types.NeuState(['V', 'inp'], help='Post-synaptic neuron must have "V" and "inp" items.')
    )

    @nb.integrate(method='euler')
    def ints(s, t):
        return - s / tau_decay

    def update(ST, _t_, pre):
        # decay s and increment it by the presynaptic spikes arriving at this step
        s = ints(ST['s'], _t_)
        s += pre['sp']
        ST['s'] = s

    @nb.delayed
    def output(ST, post):
        # postsynaptic current: I = - g_max * s * (V - E)
        post_val = - g_max * ST['s'] * (post['V'] - E)
        post['inp'] += post_val

    return nb.SynType(requires=requires, steps=(update, output), vector_based=False)
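
Independent of the framework again, the synapse above boils down to a gating variable s that decays exponentially with time constant tau_decay, jumps at each presynaptic spike, and drives a postsynaptic current I = - g_max * s * (V - E). A minimal plain-NumPy sketch of that behaviour follows; the spike times and the fixed postsynaptic potential of -65 mV are made up for illustration.

import numpy as np

# Exponentially decaying AMPA gating variable driven by a fixed presynaptic spike train.
g_max, E, tau_decay = 0.10, 0., 2.0
dt, num_steps = 0.01, 5000                  # 50 ms simulation (assumed)
spike_steps = {1000, 1200, 3000}            # presynaptic spikes at 10, 12 and 30 ms (made up)
V_post = -65.                               # postsynaptic potential held fixed (made up)

s, currents = 0., []
for i in range(num_steps):
    s += dt * (- s / tau_decay)             # Euler step of ds/dt = -s / tau_decay
    if i in spike_steps:
        s += 1.                             # each presynaptic spike increments s
    currents.append(- g_max * s * (V_post - E))

print(f"peak postsynaptic current: {max(currents):.3f}")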

Download files

Source Distribution

fangnao-0.1.0.tar.gz (114.6 kB)

File details

Details for the file fangnao-0.1.0.tar.gz.

File metadata

  • Download URL: fangnao-0.1.0.tar.gz
  • Upload date:
  • Size: 114.6 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/2.0.0 pkginfo/1.5.0.1 requests/2.24.0 setuptools/49.2.1.post20200807 requests-toolbelt/0.9.1 tqdm/4.48.2 CPython/3.7.7

File hashes

Hashes for fangnao-0.1.0.tar.gz:

  • SHA256: 5cdc01c1716cc6fa1b04ab7cd076c463126f688121e5278e5109c6c41af430a0
  • MD5: 281afa20f6efb96ee1aff8aedd57285b
  • BLAKE2b-256: 57421b2ae3c658dce62bf5ed5b4db2058a78baae5e40982d8053afced62ddbfb

