
CaLab Python

Python companion package for the CaLab calcium imaging analysis tools, covering deconvolution and data preparation.

Installation

pip install calab

Quick Start

import calab

# Build a calcium kernel
kernel = calab.build_kernel(tau_rise=0.02, tau_decay=0.4, fs=30.0)

# Get AR(2) coefficients
g1, g2, d, r = calab.tau_to_ar2(tau_rise=0.02, tau_decay=0.4, fs=30.0)

# Compute Lipschitz constant for FISTA step size
L = calab.compute_lipschitz(kernel)
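The three helpers above follow standard constructions from the calcium deconvolution literature. As a rough reference, they can be sketched in NumPy; this is an illustration of the usual math, not CaLab's exact implementation, whose kernel length and normalization may differ:

```python
import numpy as np

def build_kernel_sketch(tau_rise, tau_decay, fs, n_taus=5.0):
    """Double-exponential calcium kernel, peak-normalized.
    Kernel length (n_taus decay constants) is an arbitrary choice here."""
    t = np.arange(0.0, n_taus * tau_decay, 1.0 / fs)
    k = np.exp(-t / tau_decay) - np.exp(-t / tau_rise)
    return k / k.max()

def tau_to_ar2_sketch(tau_rise, tau_decay, fs):
    """Standard AR(2) mapping: the discrete-time poles are
    d = exp(-1/(tau_decay*fs)) and r = exp(-1/(tau_rise*fs)),
    giving coefficients g1 = d + r and g2 = -d*r."""
    d = np.exp(-1.0 / (tau_decay * fs))
    r = np.exp(-1.0 / (tau_rise * fs))
    return d + r, -d * r, d, r

def compute_lipschitz_sketch(kernel, n_fft=4096):
    """Lipschitz constant of the gradient of 0.5*||K s - y||^2 for a
    convolution operator K: the maximum squared magnitude of the
    kernel's frequency response."""
    return float(np.max(np.abs(np.fft.rfft(kernel, n_fft)) ** 2))

g1, g2, d, r = tau_to_ar2_sketch(0.02, 0.4, 30.0)
kernel = build_kernel_sketch(0.02, 0.4, 30.0)
L = compute_lipschitz_sketch(kernel)
```

The AR(2) form and the convolution-kernel form describe the same impulse response; the Lipschitz constant sets the largest safe gradient step in FISTA.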

Deconvolution

Run FISTA deconvolution that matches the CaTune web app's Rust solver exactly, including baseline estimation and lambda scaling by the kernel's DC gain:

import numpy as np
import calab

# Load your calcium traces (n_cells x n_timepoints)
traces = np.load("my_traces.npy")

# Basic: returns non-negative activity array
activity = calab.run_deconvolution(traces, fs=30.0, tau_r=0.02, tau_d=0.4, lam=0.01)

# Full: returns activity, baseline, reconvolution, iterations, converged
result = calab.run_deconvolution_full(traces, fs=30.0, tau_r=0.02, tau_d=0.4, lam=0.01)
print(f"Baseline: {result.baseline}, Converged: {result.converged}")

Note: The deconvolved output represents scaled neural activity, not discrete spikes or firing rates. The signal is scaled by an unknown constant (indicator expression level, optical path, etc.), so absolute values should not be interpreted as spike counts.
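The solver itself lives in Rust, but the core FISTA iteration for non-negative sparse deconvolution can be sketched in NumPy for reference. This sketch omits the baseline estimation and DC-gain lambda scaling mentioned above, so its output will not match run_deconvolution exactly:

```python
import numpy as np

def fista_deconv_sketch(y, kernel, lam, n_iter=300):
    """Minimize 0.5*||conv(s, kernel)[:n] - y||^2 + lam*sum(s)
    subject to s >= 0, via FISTA. Illustrative sketch only."""
    n = len(y)

    def K(s):   # forward model: convolution truncated to the trace length
        return np.convolve(s, kernel)[:n]

    def Kt(r):  # adjoint of the truncated convolution
        return np.convolve(r[::-1], kernel)[:n][::-1]

    # Gradient Lipschitz constant bounds the safe step size 1/L
    L = np.max(np.abs(np.fft.rfft(kernel, 4 * n)) ** 2)

    s = np.zeros(n)
    z = s.copy()
    t = 1.0
    for _ in range(n_iter):
        grad = Kt(K(z) - y)
        # Proximal step: soft-threshold by lam/L, then project onto s >= 0
        s_new = np.maximum(z - (grad + lam) / L, 0.0)
        t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        z = s_new + ((t - 1.0) / t_new) * (s_new - s)
        s, t = s_new, t_new
    return s
```

The momentum sequence (t, z) is what distinguishes FISTA from plain ISTA and gives the O(1/k^2) convergence rate.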

Bandpass Filter

Apply the same FFT bandpass filter used in the CaTune web app:

filtered = calab.bandpass_filter(trace, tau_rise=0.02, tau_decay=0.4, fs=100.0)
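CaTune derives the passband from tau_rise and tau_decay; that mapping is not documented here, so the generic FFT bandpass idea can be sketched with explicit cutoff frequencies instead (low_hz and high_hz are placeholders, not calab parameters):

```python
import numpy as np

def fft_bandpass_sketch(trace, low_hz, high_hz, fs):
    """Zero out FFT bins outside [low_hz, high_hz] and invert.
    A hard brick-wall filter; cutoffs are supplied explicitly rather
    than derived from kernel time constants as in CaTune."""
    spec = np.fft.rfft(trace)
    freqs = np.fft.rfftfreq(len(trace), d=1.0 / fs)
    spec[(freqs < low_hz) | (freqs > high_hz)] = 0.0
    return np.fft.irfft(spec, n=len(trace))
```

Note that a brick-wall FFT filter implicitly assumes a periodic signal, so traces with strong edge discontinuities can show ringing at the boundaries.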

Using CaTune Export JSON

Load parameters from a CaTune export JSON and run deconvolution:

import calab

# Load export params
params = calab.load_export_params("catune-params-2025-01-15.json")
# -> {'tau_rise': 0.02, 'tau_decay': 0.4, 'lambda_': 0.01, 'fs': 30.0, 'filter_enabled': False}

# One-step pipeline: loads params, optionally filters, and deconvolves
activity = calab.deconvolve_from_export(traces, "catune-params-2025-01-15.json")
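Judging from the example dict above, the export is a flat JSON object (the trailing underscore in lambda_ presumably avoids Python's lambda keyword). A minimal round-trip through the standard json module, using only the field names shown above:

```python
import json
import os
import tempfile

# Field names taken from the load_export_params example output above
params = {"tau_rise": 0.02, "tau_decay": 0.4, "lambda_": 0.01,
          "fs": 30.0, "filter_enabled": False}

# Write and re-read the file as a plain JSON object
with tempfile.NamedTemporaryFile("w", suffix=".json", delete=False) as f:
    json.dump(params, f)
    path = f.name

with open(path) as f:
    loaded = json.load(f)
os.unlink(path)

# The loaded fields map onto the deconvolution call, e.g.:
# calab.run_deconvolution(traces, fs=loaded["fs"],
#                         tau_r=loaded["tau_rise"],
#                         tau_d=loaded["tau_decay"],
#                         lam=loaded["lambda_"])
```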

Saving Data for CaTune

import calab

calab.save_for_tuning(traces, fs=30.0, path="my_recording")
# Creates my_recording.npy + my_recording_metadata.json
# Load into CaTune browser tool via the .npy file

Converting from CaImAn / Minian

CaLab works with raw calcium traces extracted by any pipeline. Use save_for_tuning() to convert extracted traces into a CaTune-compatible format. No additional dependencies are required: extract the arrays with your existing pipeline tools.

CaImAn

import h5py
import calab

with h5py.File("caiman_results.hdf5", "r") as f:
    traces = f["estimates/C"][:]       # shape: (n_cells, n_timepoints)
    fs = float(f["params/data/fr"][()])

calab.save_for_tuning(traces, fs, "my_recording")
# -> my_recording.npy + my_recording_metadata.json, ready for CaTune

Minian

import zarr
import calab

store = zarr.open("minian_output", mode="r")
traces = store["C"][:]  # shape: (n_cells, n_frames)
fs = 30.0  # user must know their frame rate

calab.save_for_tuning(traces, fs, "my_recording")

Then deconvolve

After tuning parameters in CaTune's browser interface, export your settings and apply them in Python:

import numpy as np
import calab

traces = np.load("my_recording.npy")
activity = calab.deconvolve_from_export(traces, "catune-params.json")
# activity is non-negative deconvolved neural activity (scaled by unknown constant)

API Reference

| Function | Description |
| --- | --- |
| `build_kernel(tau_rise, tau_decay, fs)` | Build double-exponential calcium kernel |
| `tau_to_ar2(tau_rise, tau_decay, fs)` | Derive AR(2) coefficients from tau values |
| `compute_lipschitz(kernel)` | Lipschitz constant for FISTA step size |
| `run_deconvolution(traces, fs, tau_r, tau_d, lam)` | FISTA deconvolution, returns activity |
| `run_deconvolution_full(traces, fs, tau_r, tau_d, lam)` | Full result with baseline, reconvolution |
| `bandpass_filter(trace, tau_rise, tau_decay, fs)` | FFT bandpass filter from kernel params |
| `save_for_tuning(traces, fs, path)` | Save traces for CaTune browser tool |
| `load_tuning_data(path)` | Load traces saved by `save_for_tuning` |
| `load_export_params(path)` | Load params from CaTune export JSON |
| `deconvolve_from_export(traces, params_path)` | Full pipeline: load params + deconvolve |
