
Curate Neurophotometrics data for pMat.

Project description

Python code for curating Neurophotometrics data

This code was written to curate Neurophotometrics (NPM) data for analysis in pMat. In short, NPM saves its data into two files: one containing the 415 nm control signal for every recorded region, and the other containing the 470 nm GCaMP signal for those same regions. However, pMat (currently) requires the opposite layout: one .csv file per region containing both the 415 and 470 signals. Producing these files by hand requires a lot of copying and pasting, which is tedious and error-prone. This code was developed to automate that process.
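In outline, the reshaping amounts to something like the following pandas sketch. The column names and values here are invented for illustration; the actual NPM column layout and NPMpy's internal implementation may differ.

```python
import pandas as pd

# Hypothetical raw NPM layout: a Timestamp column plus one column per
# recorded region (names here are assumptions for illustration only).
ctrl_415 = pd.DataFrame({
    "Timestamp": [0.0, 0.1, 0.2],
    "RegionA": [1.0, 1.1, 1.2],
    "RegionB": [2.0, 2.1, 2.2],
})
gcamp_470 = pd.DataFrame({
    "Timestamp": [0.0, 0.1, 0.2],
    "RegionA": [5.0, 5.1, 5.2],
    "RegionB": [6.0, 6.1, 6.2],
})

# For each region, combine its 415 and 470 traces into one table --
# the per-region layout that pMat expects -- and write it to its own CSV.
for region in ["RegionA", "RegionB"]:
    combined = pd.DataFrame({
        "Timestamp": ctrl_415["Timestamp"],
        "415": ctrl_415[region],
        "470": gcamp_470[region],
    })
    combined.to_csv(f"{region}.csv", index=False)
```

This is the copy-and-paste step the module automates: instead of moving columns between spreadsheets by hand, each region's two channels are written out together in one file.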

How to use this code?

A specific file structure is necessary when using the NPM Python module to curate Neurophotometrics data. The example below is the minimum structure the code requires. In short, you must input a directory that contains one subdirectory per subject, each holding that subject's raw NPM data files (which also need to be renamed to end in .npm.csv in order to be detected).

Data/      <---- This is the directory (path) that should be input to the curate_NPM() function. 
|-- Rat1/
|   |-- Rat1_415_data.npm.csv
|   |-- Rat1_470_data.npm.csv
|-- Rat2/
|   |-- Rat2_415_data.npm.csv
|   |-- Rat2_470_data.npm.csv
| ...
|-- RatN/
|   |-- RatN_415_data.npm.csv
|   |-- RatN_470_data.npm.csv

For a more general project file tree, I highly recommend something like the following to keep all of the experimental days and freezing data organized.

Data/
|-- Day1/          <---- This is the directory (path) that should be input to the "curate_NPM()" function.
|   |-- Rat1/
|   |   |-- Rat1_415_data.npm.csv
|   |   |-- Rat1_470_data.npm.csv
|   |   |-- Freezing data/
|   |   |   |-- freezing_files       <---- Notice that freezing files are kept in their own folder
|
|-- Day2/           <---- This is the directory (path) that should be input to the "curate_NPM()" function.
|   |-- Rat1/
|   |   |-- Rat1_415_data.npm.csv
|   |   |-- Rat1_470_data.npm.csv
|   |   |-- Freezing data/
|   |   |   |-- freezing_files

General workflow

  1. Organize the data into the above file structure

  2. Rename all NPM data files to end in ".npm.csv"

  3. Open your desired IDE (Jupyter, Spyder, etc.)

  4. Import this module

    import NPMpy as NPM

  5. Run NPM.curate_NPM(path_to_your_data)

  6. Done!

Download files

Download the file for your platform.

Source Distribution

NPMpy-0.1.2.tar.gz (2.1 kB)

Uploaded Source

Built Distribution

NPMpy-0.1.2-py3-none-any.whl (2.8 kB)

Uploaded Python 3

File details

Details for the file NPMpy-0.1.2.tar.gz.

File metadata

  • Download URL: NPMpy-0.1.2.tar.gz
  • Upload date:
  • Size: 2.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.3.0 pkginfo/1.6.1 requests/2.25.1 setuptools/51.0.0.post20201207 requests-toolbelt/0.9.1 tqdm/4.55.0 CPython/3.7.7

File hashes

Hashes for NPMpy-0.1.2.tar.gz
Algorithm Hash digest
SHA256 bacfb9bf557dedf86b22d86265f052c166d75bd3d76fb272fb4a2e579c4a3bf0
MD5 004f2cb0fbca94ef48ea0f3fc2662e78
BLAKE2b-256 73908a4edcca9229f7014a375bfb7ddbf08c5e4f5c3ca042c8344697b5a02f01
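A downloaded archive can be checked against the SHA256 digest listed above using Python's standard hashlib module, for example:

```python
import hashlib

def sha256_of(path):
    # Stream the file in chunks so large archives are not loaded
    # fully into memory before hashing.
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# Compare sha256_of("NPMpy-0.1.2.tar.gz") with the SHA256 value above;
# the two hex strings should match exactly.
```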


File details

Details for the file NPMpy-0.1.2-py3-none-any.whl.

File metadata

  • Download URL: NPMpy-0.1.2-py3-none-any.whl
  • Upload date:
  • Size: 2.8 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.3.0 pkginfo/1.6.1 requests/2.25.1 setuptools/51.0.0.post20201207 requests-toolbelt/0.9.1 tqdm/4.55.0 CPython/3.7.7

File hashes

Hashes for NPMpy-0.1.2-py3-none-any.whl
Algorithm Hash digest
SHA256 336885b5ef838f7ee697f887fb5096c65199d5e7a69ab71c44d12b68b2ad7143
MD5 4094b61a9783078a154b233b6740abae
BLAKE2b-256 67171b0062be47e43531baa7e906953db96e18a16366129766265df1219857a0

