Python library for agentic spike train analysis of neural electrophysiology data
# SpikeLab
SpikeLab is a Python library for loading, analyzing, visualizing, and exporting neuronal spike train data from multi-electrode array (MEA) electrophysiology experiments.
**Documentation:** spikelab.braingeneers.gi.ucsc.edu
## What SpikeLab can do
- Load data from common neuroscience formats (HDF5, NWB, KiloSort/Phy, SpikeInterface)
- Represent spike trains as `SpikeData` objects with per-unit spike times in milliseconds
- Compute firing rates as `RateData` objects (instantaneous firing rates binned over time)
- Slice around events to create `SpikeSliceStack` or `RateSliceStack` objects for event-aligned analysis
- Conduct analyses at the single-unit, pairwise, and population level
- Export data to KiloSort, NWB, and other formats
- Store and organize results using the `AnalysisWorkspace` for multi-stage analysis projects
- Access the library programmatically via a built-in MCP server for tool-based workflows
- Run spike sorting on electrophysiology recordings with built-in pipelines for Kilosort2, Kilosort4, and rt-sort (`spikelab.spike_sorting`)
- Submit batch jobs to remote Kubernetes clusters for compute-heavy workloads via `spikelab.batch_jobs`
## Installation
### Prerequisites
You need Python 3.10 or later. If you don't have Python installed, we recommend installing it via Miniconda.
### Option 1: pip install (recommended)
```
pip install spikelab
```
This installs SpikeLab and its core dependencies (numpy, scipy, matplotlib, h5py).
### Option 2: conda environment
If you prefer a conda environment with all dependencies pre-configured:
```
git clone https://github.com/braingeneers/SpikeLab.git
cd SpikeLab
conda env create -f environment.yml
conda activate spikelab
pip install spikelab
```
### Option 3: install from source
For development, clone the repository and install in editable mode:
```
git clone https://github.com/braingeneers/SpikeLab.git
cd SpikeLab
pip install -e .
```
### Verify the installation
Open a Python prompt and run:
```python
from spikelab import SpikeData
print("SpikeLab is installed correctly!")
```
If you see the success message, you're ready to go.
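For a slightly more robust check, you can query the installed version with the standard library's `importlib.metadata`. This works for any pip-installed distribution; nothing here is SpikeLab-specific:

```python
from importlib.metadata import version, PackageNotFoundError

def installed_version(dist_name):
    """Return the installed version string of a distribution, or None."""
    try:
        return version(dist_name)
    except PackageNotFoundError:
        return None

print(installed_version("spikelab") or "spikelab is not installed")
```

This reports the version pip recorded at install time, so it also catches the case where the package imports but an old release is installed.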
### Optional dependencies
Some features require additional packages that are not installed by default. Install them by appending the extra in brackets:
```
pip install "spikelab[s3]"
pip install "spikelab[s3,ml,mcp]"   # multiple extras
pip install "spikelab[all]"         # everything except kilosort4
```
| Extra | Install command | What it enables |
|---|---|---|
| `mcp` | `pip install "spikelab[mcp]"` | Built-in MCP server for tool-based workflows |
| `sse` | `pip install "spikelab[sse]"` | SSE transport for the MCP server (uvicorn + starlette) |
| `s3` | `pip install "spikelab[s3]"` | Upload/download data from Amazon S3 (or any S3-compatible store) |
| `io` | `pip install "spikelab[io]"` | Extra I/O helpers (pandas) |
| `ml` | `pip install "spikelab[ml]"` | scikit-learn, UMAP, networkx, python-louvain |
| `neo` | `pip install "spikelab[neo]"` | NWB / neo / quantities for reading NWB files |
| `ibl` | `pip install "spikelab[ibl]"` (+ `pip install git+https://github.com/int-brain-lab/paper-brain-wide-map.git`) | Query and load IBL Brain-Wide Map datasets (ONE-api; brainwidemap not on PyPI) |
| `gplvm` | `pip install "spikelab[gplvm]"` | Gaussian Process Latent Variable Model fitting |
| `numba` | `pip install "spikelab[numba]"` | Numba-accelerated routines |
| `spike-sorting` | `pip install "spikelab[spike-sorting]"` (+ MATLAB for Kilosort2) | Kilosort2 / rt-sort pipelines via `spikelab.spike_sorting` |
| `kilosort4` | `pip install "spikelab[kilosort4]"` (+ PyTorch with CUDA, installed separately) | Kilosort4 pipeline |
| `batch-jobs` | `pip install "spikelab[batch-jobs]"` | Submit jobs to remote Kubernetes clusters (`spikelab-batch-jobs` CLI) |
| `docs` | `pip install "spikelab[docs]"` | Sphinx + theme + autodoc-typehints for building the docs |
| `dev` | `pip install "spikelab[dev]"` | pytest, black, and other dev utilities |
| `all` | `pip install "spikelab[all]"` | All of the above except `kilosort4` |
When installing from a local source checkout, replace `spikelab` with `-e .` (e.g. `pip install -e ".[s3]"`).
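If your own code should degrade gracefully when an extra is missing, you can probe for a representative package at runtime. The module names below are assumptions inferred from the table (e.g. that the `s3` extra pulls in `boto3`); check the project's `pyproject.toml` for the authoritative dependency list:

```python
from importlib.util import find_spec

# Representative importable module per extra -- assumed names, not
# taken from SpikeLab's own metadata.
EXTRA_PROBES = {"s3": "boto3", "ml": "sklearn", "io": "pandas", "numba": "numba"}

def available_extras():
    """Return the subset of probed extras whose marker module imports."""
    return {extra for extra, mod in EXTRA_PROBES.items()
            if find_spec(mod) is not None}

print("Usable extras in this environment:", available_extras())
```

`find_spec` only checks that the module is importable; it does not verify the version matches what the extra pins.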
## Quick start
```python
from spikelab import SpikeData
from spikelab.data_loaders import load_spikedata_from_nwb

# Load spike data from an NWB file
sd = load_spikedata_from_nwb("recording.nwb")

# Basic properties
print(f"Units: {sd.N}")
print(f"Duration: {sd.length} ms")

# Compute instantaneous firing rates (100 ms bins)
rates = sd.rates(bin_size=100.0)

# Get a binary spike raster (1 ms bins)
raster = sd.raster(bin_size_ms=1.0)

# Compute pairwise spike time tiling coefficients
sttc_matrix = sd.spike_time_tilings(delt=20.0)

# Export to KiloSort format
sd.to_kilosort("ks_output/", fs_Hz=20000.0)
```
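To build intuition for what `spike_time_tilings` returns, here is a plain-NumPy sketch of the spike time tiling coefficient (STTC, Cutts and Eglen 2014) for a single pair of trains. This is an independent reimplementation for illustration, not SpikeLab's code, and may differ from the library in edge-case handling:

```python
import numpy as np

def tiling_fraction(spikes, dt, length):
    """Fraction of [0, length] covered by +/- dt windows around sorted spikes."""
    if len(spikes) == 0:
        return 0.0
    starts = np.clip(spikes - dt, 0, length)
    ends = np.clip(spikes + dt, 0, length)
    covered, cur_start, cur_end = 0.0, starts[0], ends[0]
    for s, e in zip(starts[1:], ends[1:]):
        if s <= cur_end:              # overlapping windows merge
            cur_end = max(cur_end, e)
        else:
            covered += cur_end - cur_start
            cur_start, cur_end = s, e
    return (covered + cur_end - cur_start) / length

def prop_within(a, b, dt):
    """Proportion of spikes in a within dt of some spike in sorted b."""
    if len(a) == 0 or len(b) == 0:
        return 0.0
    idx = np.searchsorted(b, a)
    left = np.abs(a - b[np.clip(idx - 1, 0, len(b) - 1)])
    right = np.abs(b[np.clip(idx, 0, len(b) - 1)] - a)
    return float(np.mean(np.minimum(left, right) <= dt))

def sttc(a, b, dt, length):
    """STTC for two spike trains (times in ms) over a recording of `length` ms."""
    a, b = np.sort(np.asarray(a, float)), np.sort(np.asarray(b, float))
    ta, tb = tiling_fraction(a, dt, length), tiling_fraction(b, dt, length)
    pa, pb = prop_within(a, b, dt), prop_within(b, a, dt)
    return 0.5 * ((pa - tb) / (1 - pa * tb) + (pb - ta) / (1 - pb * ta))
```

Identical trains score 1.0, and uncorrelated trains score near 0, with the tiling terms correcting for chance coincidences at a given firing rate.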
## Key concepts
- All spike times are in milliseconds throughout the library.
- `SpikeData` holds per-unit spike times and is the starting point for all analyses.
- `RateData` holds binned instantaneous firing rates with shape `(units, time_bins)`.
- `SpikeSliceStack` / `RateSliceStack` hold event-aligned slices for comparative analysis.
- `PairwiseCompMatrix` holds an N x N comparison matrix (e.g., STTC between unit pairs).
- `AnalysisWorkspace` stores intermediate results across multi-stage analysis pipelines.
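The `SpikeData` to `RateData` relationship can be sketched in plain NumPy: per-unit spike times in milliseconds become a `(units, time_bins)` array of rates. This is an illustrative stand-in for the concept, not the library's implementation:

```python
import numpy as np

def binned_rates(spike_trains, bin_size_ms, length_ms):
    """Bin per-unit spike times (ms) into a (units, time_bins) array of Hz."""
    n_bins = int(np.ceil(length_ms / bin_size_ms))
    edges = np.arange(n_bins + 1) * bin_size_ms
    rates = np.empty((len(spike_trains), n_bins))
    for i, train in enumerate(spike_trains):
        counts, _ = np.histogram(train, bins=edges)
        rates[i] = counts / (bin_size_ms / 1000.0)  # spikes per second
    return rates

# Two units, 300 ms recording, 100 ms bins -> shape (2, 3)
r = binned_rates([[50, 150, 250], [75]], bin_size_ms=100.0, length_ms=300.0)
```

One spike in a 100 ms bin corresponds to 10 Hz, which is why event-aligned comparisons usually work on these rate arrays rather than raw spike counts.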
## AI-assisted analysis
SpikeLab includes a set of built-in skills that guide your CLI agent through data analysis, spike sorting, library development, and education, all via natural language conversation.
### How it works
The skills ship inside the installed package at `spikelab/agent/skills/`. A lightweight `spikelab` router skill (installed separately into your agent's skills directory) handles environment detection (conda vs. system Python, installing SpikeLab if missing) and then delegates to the in-repo skill that best matches the user's request:
| In-repo skill | Use when the user wants to… |
|---|---|
| `spikelab-analysis-implementer` | Load data, write/run analysis scripts, generate publication-quality figures, manage results, and keep repo maps current |
| `spikelab-spikesorter` | Sort raw recordings (Kilosort2, Kilosort4, RT-Sort), curate units, run stim-aligned sorting, and inspect sorting outputs |
| `spikelab-developer` | Promote ad-hoc analysis code into the library: identify reusable methods, integrate novel computations, write tests, expose via MCP, and submit a PR |
| `spikelab-educator` | Explain what an analysis does, how a method works, or what a result means (read-only, no code execution) |
| `spikelab-map-updater` | Regenerate the repo map files after library changes |
CLI agents that load skills from installed packages pick up the in-repo skills automatically; alternatively, copy or symlink them into the agent's skills directory. As an alternative to the skills, MCP tools are available for all methods in the library.
## Directory structure
```
SpikeLab/
├── src/
│   └── spikelab/                  # Installable Python package
│       ├── spikedata/             # Core data structures and analysis
│       │   ├── spikedata.py       # SpikeData class
│       │   ├── ratedata.py        # RateData class
│       │   ├── spikeslicestack.py # SpikeSliceStack class
│       │   ├── rateslicestack.py  # RateSliceStack class
│       │   ├── pairwise.py        # PairwiseCompMatrix and PairwiseCompMatrixStack
│       │   ├── utils.py           # Shared utility functions
│       │   └── plot_utils.py      # Visualization helpers
│       ├── data_loaders/          # File I/O
│       │   ├── data_loaders.py    # Load from HDF5, NWB, KiloSort, SpikeInterface
│       │   ├── data_exporters.py  # Export to KiloSort, NWB, and other formats
│       │   └── s3_utils.py        # Amazon S3 upload/download utilities
│       ├── spike_sorting/         # Spike-sorting pipelines
│       │   ├── pipeline.py        # Top-level sorting pipeline + config
│       │   ├── ks2_runner.py      # Kilosort2 runner (requires MATLAB)
│       │   ├── ks4_runner.py      # Kilosort4 runner (PyTorch / CUDA)
│       │   ├── rt_sort/           # rt-sort runner
│       │   └── stim_sorting/      # Stimulation-aware sorting helpers
│       ├── workspace/             # Analysis workspace for storing intermediate results
│       │   ├── workspace.py       # AnalysisWorkspace class
│       │   └── hdf5_io.py         # HDF5 serialization for workspace objects
│       ├── mcp_server/            # MCP protocol server for programmatic access
│       │   ├── server.py          # MCP server implementation
│       │   └── tools/             # MCP tool definitions
│       ├── batch_jobs/            # Remote Kubernetes job submission
│       │   ├── cli.py             # spikelab-batch-jobs CLI
│       │   ├── session.py         # RunSession entry point
│       │   ├── policy.py          # Pre-submission policy checks
│       │   ├── profiles/          # Built-in cluster profiles (YAML)
│       │   └── templates/         # Jinja2 manifest templates
│       └── agent/                 # Bundled agent skills (analysis-implementer, …)
│           └── skills/
├── tests/                         # Test suite (pytest)
├── docs/                          # Sphinx documentation source
├── examples/                      # Example scripts and notebooks
├── environment.yml                # Conda environment specification
└── pyproject.toml                 # Package configuration
```
## Running tests
```
git clone https://github.com/braingeneers/SpikeLab.git
cd SpikeLab
pip install -e ".[dev]"
pytest tests/ -v
```
## Contributing
Contributions are welcome! Please open an issue or pull request on the GitHub repository.
All code must be formatted with Black. You can check formatting with:
```
black --check .
```
## License
SpikeLab is released under the MIT License.
## File details: spikelab-0.1.1.tar.gz

- Size: 810.5 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.10.20

| Algorithm | Hash digest |
|---|---|
| SHA256 | `9bff2e1591ee8c92396718cc7108b23b14ef9128756153af30a134233696a8bd` |
| MD5 | `b30bbbea4194fbc8ebfe844e19da4d7f` |
| BLAKE2b-256 | `74dc9f3caefe307096868c0ee5e22d39df01388b762c35e19cd1904e51ba35c9` |
## File details: spikelab-0.1.1-py3-none-any.whl

- Size: 466.3 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.10.20

| Algorithm | Hash digest |
|---|---|
| SHA256 | `04d90fcf6c4a9eb1d031c116e606414fdcbdffb82d15457e642e1fb62f207a3a` |
| MD5 | `f77c1d33d32339c851c1f0dd5f6b5a71` |
| BLAKE2b-256 | `497793824f8ad4656a6175e3ae201695a76fcc830adbcae4fd4bff71be22f820` |