
PyNeuroTrace: Python code for Neural Timeseries


Installation

'pyNeuroTrace' can be installed with pip:

pip install pyNeuroTrace

GPU-supported functions use the Python library CuPy, which has the following requirements:

  • NVIDIA CUDA GPU with Compute Capability 3.0 or higher.

  • CUDA Toolkit: v11.2 / v11.3 / v11.4 / v11.5 / v11.6 / v11.7 / v11.8 / v12.0 / v12.1 / v12.2 / v12.3 / v12.4

'pyNeuroTrace' can be installed together with CuPy using pip:

pip install pyNeuroTrace[GPU]

If CuPy fails to build from the wheel with this command, try installing CuPy first from a wheel that matches your CUDA Toolkit version, e.g.:

pip install cupy-cuda12x

Then install pyNeuroTrace:

pip install pyNeuroTrace

For more information on installing CuPy, see the CuPy installation documentation.

Documentation

To help get started using 'pyNeuroTrace', full documentation of the software and available modules and methods is available at our pyNeuroTrace github.io site.

Visualization

Probably the most useful part of pyNeuroTrace, the viz module provides a number of visualization functions to help display trace data in easy-to-understand formats. For more details, and visual examples of what is available, please consult the README next to viz.py.
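As a taste, here is a minimal sketch of plotting a block of processed traces. The plotIntensity name and its hz argument are assumptions here; check the viz README for the actual function names and signatures.

import numpy as np
from pyneurotrace import viz

# Hypothetical stand-in data: 10 traces, 1000 samples each.
dff = np.random.rand(10, 1000)

# Assumed heatmap-style helper; verify the name against the viz README.
viz.plotIntensity(dff, hz=100)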

Notebook utilities

Although unrelated to neural time series, pyNeuroTrace also provides a collection of tools that are useful when running analysis in Jupyter notebooks.

These include:

notebook.filePicker for selecting an existing file, notebook.newFilePicker for indicating a new path, and notebook.folderPicker for selecting an existing folder.

These all open PyQt file dialogs, making it easy for users to interact with the file system. Customisation options exist, for example the prompt text and default location.
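For example, a minimal sketch of picking an input file; calling filePicker with no arguments and receiving the selected path back are assumptions, so consult the documentation for the exact signature.

from pyneurotrace import notebook

# Assumed no-argument call returning the selected path (unverified).
path = notebook.filePicker()
print(path)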

showTabs(data, func, titles, progressBar=False)

Useful for performing the same analysis multiple times across e.g. different neurons, experiments, conditions, etc. Given either a list or a dictionary, all items will be iterated over, and each is drawn onto its own tab using the provided func. The function provided should have the signature func(idx, key, value), where idx is the (0-based) index of the tab, key is the list index or dictionary key, and value is the item to be processed.
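For example, a minimal sketch drawing one plot per condition, one tab each; the trace data here is a hypothetical stand-in.

import numpy as np
import matplotlib.pyplot as plt
from pyneurotrace import notebook

# Hypothetical stand-in data: a few traces per condition.
tracesByCondition = {
    "baseline": np.random.rand(5, 1000),
    "stimulus": np.random.rand(5, 1000),
}

def drawTab(idx, key, value):
    # idx: 0-based tab index; key: dictionary key; value: the item.
    plt.plot(value.T)
    plt.title(key)

notebook.showTabs(tracesByCondition, drawTab, titles=list(tracesByCondition.keys()))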

Processing Data

Common per-trace processing filters are provided within filters.py. These are all designed to take a numpy array of traces, with each row an independent trace, and all return a filtered array of the same size.

These include:

filters.deltaFOverF0(traces, hz, t0, t1, t2)

Converts raw signal to the standard Delta-F-over-F0, using the technique given in Jia et al., 2011. The smoothing parameters (t0, t1, t2) are as described in the paper, all in units of seconds. The sample rate (hz) must also be provided to convert these to sample units.

filters.okada(traces)

Reduces noise in traces by smoothing single peaks or valleys, as described in Okada et al., 2016.
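Put together, a minimal sketch of the per-trace pipeline; the sample rate and the smoothing parameter values below are illustrative assumptions, not recommendations.

import numpy as np
from pyneurotrace import filters

hz = 100                           # assumed sample rate, in Hz
traces = np.random.rand(10, 1000)  # one trace per row (stand-in data)

# Denoise first, then convert to dF/F0 with smoothing windows in seconds.
smoothed = filters.okada(traces)
dff = filters.deltaFOverF0(smoothed, hz, t0=0.2, t1=0.75, t2=3.0)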

Event Detection

Python implementations of the three algorithms discussed in our paper, Sakaki et al., 2018, for finding events within calcium fluorescence traces.

ewma(data, weight)

Calculates the Exponentially-Weighted Moving Average for each trace, given how strongly to weight new points (vs. the previous average).

cusum(data, slack)

Calculates the Cumulative Sum for each trace, taking a slack parameter that controls how far from the mean the signal needs to be before it is no longer considered noise.

matchedFilter(data, windowSize, A, tA, tB)

Calculates the likelihood ratio for each sample being the end of a window of the expected transient shape: a double exponential with amplitude A, rise time tA, and decay time tB (in samples).

The results of each of these three detection filters can then be passed through thresholdEvents(data, threshold), to register detected events wherever the filter output rises above the given threshold.
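For example, a minimal sketch of one detection pass; the events module name, and the weight and threshold values, are assumptions here.

import numpy as np
from pyneurotrace import events

dff = np.random.rand(10, 1000)  # stand-in for processed dF/F0 traces

# Filter strength per sample, then binarise into detected events.
strength = events.ewma(dff, weight=0.1)
detected = events.thresholdEvents(strength, threshold=0.5)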

Reading Data (lab-specific)

The code within this repository was designed to read data from experiments performed by the Kurt Haas lab at UBC. If you're from this lab, read on. If not, this part is probably not relevant, but feel free to ask if you'd be interested in loading your own data file formats.

A number of options for loading data files are available within files.py, including:

  • load2PData(path) takes an experiment output file (e.g. STEP_5_EXPT.TXT) and returns the ID, XYZ location, and raw intensity values for each node in the experiment.
  • loadMetadata(path) takes a metadata file (e.g. rscan_metadata_step_5.txt) and returns the stimulus start/stop samples, as well as the sample rate for the experiment.
  • loadTreeStructure(path) takes a tree structure file (e.g. interp-neuron-.txt) and returns the mapping of node IDs to tree information about that node (e.g. node type, children, ...).
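A minimal sketch of loading one experiment follows; the paths are the example filenames above, and the unpacked return values follow the descriptions but are otherwise assumptions.

from pyneurotrace import files

# Assumed three-way unpacking, per the description of load2PData.
nodeIDs, xyz, raw = files.load2PData("STEP_5_EXPT.TXT")

# Stimulus start/stop samples plus sample rate (exact shape unverified).
metadata = files.loadMetadata("rscan_metadata_step_5.txt")

# Mapping from node IDs to tree information for each node.
tree = files.loadTreeStructure("interp-neuron-.txt")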
