
Curate Neurophotometrics data for pMat.

Project description

Python code for curating Neurophotometrics data

This code was written to curate Neurophotometrics (NPM) data for analysis in pMat. In short, NPM saves its data into two files: one containing the 415 nm control signal for every region recorded and the other containing the 470 nm GCaMP signal for all regions recorded. However, pMat (currently) requires the opposite: one .csv file per region that contains both the 415 and 470 signals. Reshaping the data by hand requires a lot of copying and pasting, which is tedious and error-prone. This code was developed to automate that process.
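To illustrate the reshaping that curate_NPM() automates, here is a minimal pandas sketch. The column names ("Timestamp", one column per region) and file names are assumptions for illustration only; the real NPM export schema and the module's internal logic may differ.

    import pandas as pd

    # Hypothetical illustration of the per-region reshaping step.
    ctrl = pd.read_csv("Rat1_415_data.npm.csv")   # 415 nm control signal, all regions
    gcamp = pd.read_csv("Rat1_470_data.npm.csv")  # 470 nm GCaMP signal, all regions

    region_cols = [c for c in ctrl.columns if c != "Timestamp"]

    for region in region_cols:
        # One output file per region, holding both the 415 and 470 traces,
        # which is the layout pMat expects.
        out = pd.DataFrame({
            "Timestamp": ctrl["Timestamp"],
            "415": ctrl[region],
            "470": gcamp[region],
        })
        out.to_csv(f"Rat1_{region}.csv", index=False)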

How to use this code?

A specific file structure is required to use the NPMpy Python module for curating Neurophotometrics data. The example below shows the minimum structure needed for the code to work. In short, you must pass in a directory that contains one subdirectory per subject, each holding that subject's raw NPM data files (which also need to be renamed to end in .NPM.csv in order to be detected). A sketch of how such a layout can be traversed follows the tree below.

Data/      <---- This is the directory (path) that should be input to the curate_NPM() function. 
|-- Rat1/
|   |-- Rat1_415_data.npm.csv
|   |-- Rat1_470_data.npm.csv
|-- Rat2/
|   |-- Rat2_415_data.npm.csv
|   |-- Rat2_470_data.npm.csv
| ...
|-- RatN/
|   |-- RatN_415_data.npm.csv
|   |-- RatN_470_data.npm.csv
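As a rough illustration of how a curation script can walk this structure (a sketch only; the actual discovery logic inside curate_NPM() may differ), something like the following finds each subject's pair of NPM files:

    from pathlib import Path

    data_dir = Path("Data")  # the directory that would be passed to curate_NPM()

    for subject_dir in sorted(p for p in data_dir.iterdir() if p.is_dir()):
        # Keep only files renamed to end in .NPM.csv (any letter case,
        # since the tree above uses ".npm.csv").
        npm_files = [f for f in subject_dir.glob("*.csv")
                     if f.name.lower().endswith(".npm.csv")]
        ctrl_415 = next(f for f in npm_files if "415" in f.name)
        gcamp_470 = next(f for f in npm_files if "470" in f.name)
        print(subject_dir.name, ctrl_415.name, gcamp_470.name)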

For a more general project file tree, I highly recommend something like the following to keep all of the experimental days and freezing data organized.

Data/
|-- Day1/          <---- This is the directory (path) that should be input to the curate_NPM() function. 
|   |-- Rat1/
|   |   |-- Rat1_415_data.npm.csv
|   |   |-- Rat1_470_data.npm.csv
|   |   |-- Freezing data/
|   |   |   |-- freezing_files       <---- Notice that freezing files are kept in their own folder
|
|-- Day2/           <---- This is the directory (path) that should be input to the curate_NPM() function. 
|   |-- Rat1/
|   |   |-- Rat1_415_data.npm.csv
|   |   |-- Rat1_470_data.npm.csv
|   |   |-- Freezing data/
|   |   |   |-- freezing_files
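With this layout, each day's directory is curated separately. A minimal driver loop might look like the following (a sketch only, assuming curate_NPM() is exposed as NPM.curate_NPM() and accepts a single day directory, as the workflow below suggests):

    import NPMpy as NPM
    from pathlib import Path

    project_dir = Path("Data")

    # Curate each experimental day separately; every Day*/ folder holds
    # one subdirectory per subject, as in the tree above.
    for day_dir in sorted(project_dir.glob("Day*")):
        NPM.curate_NPM(str(day_dir))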

General workflow

  1. Organize the data into the above file structure

  2. Rename all NPM data files so that they end in ".NPM.csv"

  3. Open your desired IDE (Jupyter, Spyder, etc.)

  4. Import this module

    import NPMpy as NPM

  5. Run curate_NPM(path_to_your_data) (a complete example follows this list)

  6. Done!
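Put together, a minimal session might look like this (the path is illustrative; the call assumes the module is imported as in step 4):

    import NPMpy as NPM

    # Path to the directory containing one folder per subject
    # ("Data/" in the first tree, or "Data/Day1/" in the second).
    path_to_your_data = "Data/Day1"

    NPM.curate_NPM(path_to_your_data)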


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

NPMpy-0.1.5.tar.gz (3.9 kB)


Built Distribution

NPMpy-0.1.5-py3-none-any.whl (4.9 kB)


File details

Details for the file NPMpy-0.1.5.tar.gz.

File metadata

  • Download URL: NPMpy-0.1.5.tar.gz
  • Upload date:
  • Size: 3.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.3.0 pkginfo/1.6.1 requests/2.25.1 setuptools/51.0.0.post20201207 requests-toolbelt/0.9.1 tqdm/4.55.0 CPython/3.7.7

File hashes

Hashes for NPMpy-0.1.5.tar.gz
Algorithm Hash digest
SHA256 6996b44836b98dc5ee472bac72bc82e725e3ae0248346e4b12104e58f7f191a9
MD5 ab2685ae915ff09bea7090d02411c4ff
BLAKE2b-256 eb897353869e59a22f039d51f64181aee62b366486ff676f7c1d17e395eb8d3b
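If you download the archive manually, the digests above can be checked locally; here is a minimal sketch using Python's standard library (the file path is illustrative):

    import hashlib

    expected = "6996b44836b98dc5ee472bac72bc82e725e3ae0248346e4b12104e58f7f191a9"

    # Hash the downloaded sdist and compare against the SHA256 listed above.
    with open("NPMpy-0.1.5.tar.gz", "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()

    print("OK" if digest == expected else "MISMATCH")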


File details

Details for the file NPMpy-0.1.5-py3-none-any.whl.

File metadata

  • Download URL: NPMpy-0.1.5-py3-none-any.whl
  • Upload date:
  • Size: 4.9 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.3.0 pkginfo/1.6.1 requests/2.25.1 setuptools/51.0.0.post20201207 requests-toolbelt/0.9.1 tqdm/4.55.0 CPython/3.7.7

File hashes

Hashes for NPMpy-0.1.5-py3-none-any.whl
Algorithm Hash digest
SHA256 b7f6a6fba6a8fa7fbdc94bd5e2cb3c2ab10e8a381b0b8c0245a4b0a4829c8172
MD5 7db42f8576c47cac15e4cd58a2ef50ba
BLAKE2b-256 c139c4fafe752af8f1dcaa057b4d9bc8537b4fd85f9e6f7633bddb1b71e8692e

