Project description

Differentiable neuron simulations on CPU, GPU, or TPU

Documentation | Getting Started | Install guide | Reference docs | FAQ

What is Jaxley?

Jaxley is a differentiable simulator for biophysical neuron models, written in Python using the JAX library. Its key features are:

  • automatic differentiation, allowing gradient-based optimization of thousands of parameters
  • support for CPU, GPU, or TPU without any changes to the code
  • jit-compilation, making it as fast as other packages while being fully written in Python
  • support for multicompartment neurons
  • elegant mechanisms for parameter sharing
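
The first three features can be sketched with plain JAX (this is not Jaxley's API, only the underlying principle): a simulator written as a pure function can be jit-compiled and differentiated end-to-end. The leaky-membrane model, loss, and all parameter values below are illustrative.

```python
import jax
import jax.numpy as jnp

def simulate(g_leak, v0=-70.0, e_leak=-55.0, dt=0.025, n_steps=400):
    """Integrate dv/dt = -g_leak * (v - e_leak) with explicit Euler."""
    def step(v, _):
        v = v + dt * (-g_leak * (v - e_leak))
        return v, v
    _, vs = jax.lax.scan(step, v0, None, length=n_steps)
    return vs

def loss(g_leak):
    # Toy loss: squared distance of the final voltage from a target value.
    return (simulate(g_leak)[-1] - (-60.0)) ** 2

# jit-compiled gradient of the loss through the entire simulation;
# this is what enables gradient-based optimization of model parameters.
grad_fn = jax.jit(jax.grad(loss))
print(grad_fn(0.1))
```

The same function runs unchanged on CPU, GPU, or TPU, since JAX dispatches to whichever backend is configured.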

Getting started

Jaxley makes it possible to simulate biophysical neuron models on CPU, GPU, or TPU:

import matplotlib.pyplot as plt
from jax import config

import jaxley as jx
from jaxley.channels import HH

config.update("jax_platform_name", "cpu")  # Or "gpu" / "tpu".

cell = jx.Cell()  # Define cell.
cell.insert(HH())  # Insert channels.

current = jx.step_current(i_delay=1.0, i_dur=1.0, i_amp=0.1, delta_t=0.025, t_max=10.0)
cell.stimulate(current)  # Stimulate with step current.
cell.record("v")  # Record voltage.

v = jx.integrate(cell)  # Run simulation.
plt.plot(v.T)  # Plot voltage trace.

The documentation gives an overview of what kinds of models can be implemented in Jaxley. If you want to learn more, we recommend checking out our tutorials.

Installation

Jaxley is available on PyPI:

pip install jaxley

This will install Jaxley with CPU support. If you want GPU support, follow the instructions in the JAX GitHub repository to install JAX with GPU support (in addition to installing Jaxley). For example, for NVIDIA GPUs, run

pip install -U "jax[cuda12]"
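
To check which backend JAX picked up after installation (assuming JAX imports successfully), you can inspect its default backend and visible devices:

```python
import jax

# "cpu" on a CPU-only install; "gpu" or "tpu" if an accelerator is available.
print(jax.default_backend())
print(jax.devices())  # the devices JAX can dispatch computations to
```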

Feedback and Contributions

We welcome any feedback on how Jaxley is working for your neuron models and are happy to receive bug reports, pull requests, and other feedback (see contribute). We wish to maintain a positive community; please read our Code of Conduct.

License

Apache License Version 2.0 (Apache-2.0)

Citation

If you use Jaxley, consider citing the corresponding paper:

@article{deistler2024differentiable,
  doi = {10.1101/2024.08.21.608979},
  year = {2024},
  publisher = {Cold Spring Harbor Laboratory},
  author = {Deistler, Michael and Kadhim, Kyra L. and Pals, Matthijs and Beck, Jonas and Huang, Ziwei and Gloeckler, Manuel and Lappalainen, Janne K. and Schr{\"o}der, Cornelius and Berens, Philipp and Gon{\c c}alves, Pedro J. and Macke, Jakob H.},
  title = {Differentiable simulation enables large-scale training of detailed biophysical models of neural dynamics},
  journal = {bioRxiv}
}

Download files

Download the file for your platform.

Source Distribution

jaxley-0.5.0.tar.gz (122.2 kB)

Uploaded Source

Built Distribution

Jaxley-0.5.0-py3-none-any.whl (159.8 kB)

Uploaded Python 3

File details

Details for the file jaxley-0.5.0.tar.gz.

File metadata

  • Download URL: jaxley-0.5.0.tar.gz
  • Upload date:
  • Size: 122.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.12.4

File hashes

Hashes for jaxley-0.5.0.tar.gz

  • SHA256: 316296f31c39a60bdd109a53fdfa04f01d82f5c5b6743c27b47509df398659d0
  • MD5: 811341bb85797c8a7aea02304923c6f4
  • BLAKE2b-256: 554f4ed76208cc87f69d8d6b02f8fa86ce719588a6af98fe24795543fc222d2b
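
If you want pip to verify these digests at install time, you can pin them in a requirements file. This is a sketch: pip's `--require-hashes` mode also demands hashes for every transitive dependency, which are omitted here.

```shell
# requirements.txt pinning Jaxley 0.5.0 to the SHA256 digests listed on
# this page (sdist and wheel); pip rejects any file whose digest differs.
cat > requirements.txt <<'EOF'
jaxley==0.5.0 \
    --hash=sha256:316296f31c39a60bdd109a53fdfa04f01d82f5c5b6743c27b47509df398659d0 \
    --hash=sha256:35198374191522fd77dd014d112aa145d9d38ba74e086d6c83128032507887d1
EOF
pip install --require-hashes -r requirements.txt
```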

See the PyPI documentation for more details on using hashes.

File details

Details for the file Jaxley-0.5.0-py3-none-any.whl.

File metadata

  • Download URL: Jaxley-0.5.0-py3-none-any.whl
  • Upload date:
  • Size: 159.8 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.12.4

File hashes

Hashes for Jaxley-0.5.0-py3-none-any.whl

  • SHA256: 35198374191522fd77dd014d112aa145d9d38ba74e086d6c83128032507887d1
  • MD5: a27afd56d94ff4fd5b5cf590f6d07763
  • BLAKE2b-256: 2720e6f7746f89719806c053f088b4eee915ff6155a16a33ffd020a48545d11f
