
Curate Neurophotometrics data for pMat.

Project description

Python code for curating Neurophotometrics data

This code was written to curate Neurophotometrics (NPM) data for analysis in pMat. In short, NPM saves the data into two files: one containing the 415 nm control signal for every recorded region, and the other containing the 470 nm GCaMP signal for those same regions. However, pMat (currently) requires the opposite: one .csv file per region containing both the 415 and 470 signals. Rearranging the data by hand requires a lot of copying and pasting, which is tedious and error-prone. This code was developed to automate that process.
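To illustrate the rearrangement the module automates, here is a minimal sketch using only the standard library. It assumes each raw file has a timestamp column followed by one column per recorded region; the actual NPM column layout and the module's internals may differ, and `merge_channels` is an illustrative name, not part of the NPMpy API.

```python
import csv
from pathlib import Path

def merge_channels(path_415, path_470, out_dir):
    """Combine a 415 file and a 470 file into one CSV per region.

    Assumed layout: first column is a timestamp, each remaining
    column is one region's signal, with the same region order in
    both files.
    """
    rows_415 = list(csv.reader(open(path_415, newline="")))
    rows_470 = list(csv.reader(open(path_470, newline="")))
    regions = rows_415[0][1:]  # every column after the timestamp
    out_dir = Path(out_dir)
    out_dir.mkdir(parents=True, exist_ok=True)
    for i, region in enumerate(regions, start=1):
        with open(out_dir / f"{region}.csv", "w", newline="") as f:
            writer = csv.writer(f)
            writer.writerow(["Timestamp", "415", "470"])
            # Pair each 415 row with the matching 470 row
            for r415, r470 in zip(rows_415[1:], rows_470[1:]):
                writer.writerow([r415[0], r415[i], r470[i]])
```

The per-region output files produced this way contain both channels side by side, which is the shape pMat expects.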

How to use this code?

A specific file structure is necessary for using the NPM Python module to curate Neurophotometrics data. The example below is the minimum structure required for the code to work. In short, you must input a directory that contains a subdirectory for each subject, each holding that subject's raw NPM data files (which also need to be renamed to end in .npm.csv in order to be detected).

Data/      <---- This is the directory (path) that should be input to the curate_NPM() function. 
|-- Rat1/
|   |-- Rat1_415_data.npm.csv
|   |-- Rat1_470_data.npm.csv
|-- Rat2/
|   |-- Rat2_415_data.npm.csv
|   |-- Rat2_470_data.npm.csv
| ...
|-- RatN/
|   |-- RatN_415_data.npm.csv
|   |-- RatN_470_data.npm.csv
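The layout above can be checked programmatically. The following sketch walks a data directory and pairs up each subject's 415 and 470 files; `find_npm_pairs` is an illustrative helper, not part of the NPMpy API, and it assumes the channel is identifiable from "415" or "470" in the filename, as in the tree above.

```python
from pathlib import Path

def find_npm_pairs(data_dir):
    """Return {subject_name: (path_to_415_file, path_to_470_file)}
    for every subject subdirectory that holds both channels."""
    pairs = {}
    for subject in sorted(Path(data_dir).iterdir()):
        if not subject.is_dir():
            continue
        # Only files renamed to end in .npm.csv are considered
        files = sorted(subject.glob("*.npm.csv"))
        ch415 = [f for f in files if "415" in f.name]
        ch470 = [f for f in files if "470" in f.name]
        if ch415 and ch470:
            pairs[subject.name] = (ch415[0], ch470[0])
    return pairs
```

Subjects missing either channel are simply skipped here; the real module may handle that case differently.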

For a more general project file tree, I highly recommend something like the following to keep all of the experimental days and freezing data organized.

Data/
|-- Day1/          <---- This is the directory (path) that should be input to the curate_NPM() function. 
|   |-- Rat1/
|   |   |-- Rat1_415_data.npm.csv
|   |   |-- Rat1_470_data.npm.csv
|   |   |-- Freezing data/
|   |   |   |-- freezing_files       <---- Notice that freezing files are kept in their own folder
|
|-- Day2/           <---- This is the directory (path) that should be input to the curate_NPM() function. 
|   |-- Rat1/
|   |   |-- Rat1_415_data.npm.csv
|   |   |-- Rat1_470_data.npm.csv
|   |   |-- Freezing data/
|   |   |   |-- freezing_files

General workflow

  1. Organize the data into the above file structure

  2. Rename all NPM data files to end in ".npm.csv"

  3. Open your desired IDE (Jupyter, Spyder, etc.)

  4. Import this module

    import NPMpy as NPM

  5. Run curate_NPM(path_to_your_data)

  6. Done!

Download files

Download the file for your platform.

Source Distribution

NPMpy-0.1.6.tar.gz (3.9 kB view details)

Uploaded Source

Built Distribution


NPMpy-0.1.6-py3-none-any.whl (4.9 kB view details)

Uploaded Python 3

File details

Details for the file NPMpy-0.1.6.tar.gz.

File metadata

  • Download URL: NPMpy-0.1.6.tar.gz
  • Upload date:
  • Size: 3.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.3.0 pkginfo/1.6.1 requests/2.25.1 setuptools/51.0.0.post20201207 requests-toolbelt/0.9.1 tqdm/4.55.0 CPython/3.7.7

File hashes

Hashes for NPMpy-0.1.6.tar.gz
Algorithm Hash digest
SHA256 61a94923b250134f2918d0758accd70019589de65c570d82c942ab65ddf85b56
MD5 f4586a3cbc4894820e39e94ef03fae1b
BLAKE2b-256 c17509350fbf590a7d637b0a1306079fb41edb2c4f4478a34cb0e62aee7717fa


File details

Details for the file NPMpy-0.1.6-py3-none-any.whl.

File metadata

  • Download URL: NPMpy-0.1.6-py3-none-any.whl
  • Upload date:
  • Size: 4.9 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.3.0 pkginfo/1.6.1 requests/2.25.1 setuptools/51.0.0.post20201207 requests-toolbelt/0.9.1 tqdm/4.55.0 CPython/3.7.7

File hashes

Hashes for NPMpy-0.1.6-py3-none-any.whl
Algorithm Hash digest
SHA256 73fa45f22ba0ddaca9fea0f6f7b2c0557fc114c3e1da5a781a415d4cfb6f034d
MD5 b601b8a563c2b70b835231b066694f97
BLAKE2b-256 6387719c9e43ced317aac7d74eca9a0386beea4e423122b161100ebc786c5b29

