
Python package to process and visualize TDT fiber photometry data


FibPhoFlow

FibPhoFlow is an open-source fiber photometry data analysis package compatible with data obtained from TDT fiber photometry hardware and software.

Installation

This package is available on PyPI for installation with pip, but the HDF5-related dependencies, h5py and PyTables, tend to cause build errors in some environments. The simplest way to install this package is through the Conda package manager (Anaconda). If you want to pip install this package without Conda, you will first need to install the HDF5 C library. The instructions that follow are for the Conda installation.

Step 1:

  • Download Anaconda or Miniconda
  • Download an IDE that lets you view plots, such as Spyder or JupyterLab
    • Once you have Conda, you can install Spyder or JupyterLab with one of these:
      conda install -c anaconda spyder
      
      conda install -c conda-forge jupyterlab
      

Step 2: Create Conda virtual environment for fibphoflow

  • Open your Terminal
  • Create a folder somewhere (e.g., your desktop) and place the fibphoflow environment.yml file in it, then create the Conda environment from it:
mkdir folder_name
cd folder_name
conda env create -f environment.yml
  • Activate the Conda environment:
conda activate fibphoflow_venv
  • For more information on general Conda environment usage, see the Conda documentation on managing environments

Step 3: Connect the Conda venv with your IDE

Overview of Workflow

  • Note that a well-documented practice analysis walkthrough will be added soon.
  1. The raw fiber photometry data streams from TDT photometry recording directories are loaded using the tdt Python package. TDT recording files need to be located in the working directory or its subdirectories to be found when running fibphoflow.py. The script is currently only compatible with a GCaMP stream and a UV/isosbestic stream.

  2. A simple low-pass filtering step (a moving average) is applied to help smooth out signal noise.

  3. The streams (GCaMP and UV) are downsampled based on the Hz value one sets.

  4. A normalization of the calcium stream is performed using the isosbestic stream.

  5. From here, based on the experimental metadata found in the experiment's Excel file, the processed streams are chopped up into specific "epoch" traces for analysis, and these traces are normalized to user-defined recording baseline periods to obtain traces in terms of delta F/F.
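The smoothing, downsampling, and isosbestic-normalization steps above can be sketched with NumPy on synthetic data. This is an illustrative outline, not fibphoflow's actual implementation: the 1017 Hz native rate, the 100-sample window, the function names, and the least-squares isosbestic fit are all assumptions. In practice the raw streams would come from the tdt package (step 1) rather than being simulated.

```python
import numpy as np

def moving_average(x, window):
    """Simple low-pass filter: boxcar moving average (step 2)."""
    kernel = np.ones(window) / window
    return np.convolve(x, kernel, mode="same")

def downsample(x, native_hz, target_hz):
    """Downsample by keeping every Nth sample (step 3)."""
    step = int(native_hz // target_hz)
    return x[::step]

def isosbestic_dff(gcamp, uv):
    """Fit the UV/isosbestic stream to the GCaMP stream with a
    least-squares line, then compute dF/F against the fit (step 4)."""
    slope, intercept = np.polyfit(uv, gcamp, 1)
    fitted = slope * uv + intercept
    return (gcamp - fitted) / fitted

# Synthetic streams standing in for TDT recordings; in real use these
# would be loaded from a TDT block via the tdt package.
native_hz, target_hz = 1017, 10
t = np.arange(0, 60, 1 / native_hz)
rng = np.random.default_rng(0)
uv = 1.0 + 0.05 * rng.standard_normal(t.size)        # isosbestic control
gcamp = uv + 0.2 * np.exp(-((t - 30) ** 2) / 2)      # shared noise + transient

gcamp_s = downsample(moving_average(gcamp, 100), native_hz, target_hz)
uv_s = downsample(moving_average(uv, 100), native_hz, target_hz)
dff = isosbestic_dff(gcamp_s, uv_s)                  # peaks near t = 30 s
```

Epoch chopping and baseline normalization (step 5) would then slice `dff` by the timestamps in the experiment's Excel metadata.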

To-do's

  1. Fix package dependency issues and enable installation via conda-forge

  2. Add an option for a Butterworth low-pass filtering step.

  3. Improve Jupyter compatibility so that report printouts are easier to generate

  4. Clean up code commenting/documentation, along with error messages

  5. Add a third (red) calcium channel

  6. Add z-scoring
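For the z-scoring to-do, one common approach (a sketch under assumed conventions, not a committed design for fibphoflow) is to z-score each delta F/F trace against its own user-defined baseline window:

```python
import numpy as np

def baseline_zscore(trace, baseline_slice):
    """Z-score a dF/F trace against a baseline window:
    (trace - baseline mean) / baseline std."""
    base = trace[baseline_slice]
    return (trace - base.mean()) / base.std()

# Example: 100 baseline samples around 0, then a 50-sample response near 5.
rng = np.random.default_rng(0)
trace = np.concatenate([rng.normal(0.0, 1.0, 100), rng.normal(5.0, 1.0, 50)])
z = baseline_zscore(trace, slice(0, 100))  # baseline becomes mean 0, std 1
```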

