npc_lims
neuropixels cloud lab information management system

Tools to fetch and update paths, metadata and state for Mindscope Neuropixels sessions, in the cloud.
quickstart
- make a new Python >=3.9 virtual environment with conda or venv (the lighter option, since this package does not require pandas, numpy, etc.):
  `python -m venv .venv`
- activate the virtual environment:
  - Windows: `.venv\scripts\activate`
  - Unix: `source .venv/bin/activate`
- install the package:
  `python -m pip install npc_lims`
- setup credentials (a quick sanity-check sketch follows this list)
  - required environment variables:
    - AWS S3: `AWS_DEFAULT_REGION`, `AWS_ACCESS_KEY_ID`, `AWS_SECRET_ACCESS_KEY`
      - to find and read files on S3
      - must have read access on relevant aind buckets
      - can be in a standard `~/.aws` location, as used by the AWS CLI or boto3
    - CodeOcean API: `CODE_OCEAN_API_TOKEN`, `CODE_OCEAN_DOMAIN`
      - to find processed data in "data assets" via the CodeOcean API
      - generated in CodeOcean:
        - right click on `Account` (bottom left, person icon) and click `User Secrets` - these are secrets that can be made available as environment variables in CodeOcean capsules
        - go to `Access Tokens` and click `Generate new token` - this is for programmatically querying CodeOcean's databases
          - in `Token Name` enter `Codeocean API (read)` and check `read` on capsules and datasets
          - a token will be generated: click copy (and store it in a password manager, if you use one)
        - head back to `User Secrets`, where we'll paste it into a new secret via `Add secret > API credentials`
          - in `description` enter `Codeocean API (read)`
          - in `API key` enter `CODE_OCEAN_API_KEY`
          - in `API secret` paste the copied secret from before (should start with `cop_...`)
        - `CODE_OCEAN_DOMAIN` is the CodeOcean https address, up to and including `.org`
  - environment variables can also be specified in a file named `.env` in the current working directory
    - example: https://www.dotenv.org/docs/security/env.html
    - be very careful that this file does not get pushed to public locations, e.g. github
      - if using git, add it to a `.gitignore` file in your project's root directory:
        `.env*`
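  As a quick sanity check before using the package, here is a minimal sketch (not part of npc_lims) that confirms the variables listed above are visible to Python; note the AWS_* variables may legitimately be unset if your credentials live in `~/.aws` instead:

  ```python
  import os

  # names taken from the credentials list above
  REQUIRED_VARS = (
      "AWS_DEFAULT_REGION",      # may be omitted if credentials come from ~/.aws
      "AWS_ACCESS_KEY_ID",       # may be omitted if credentials come from ~/.aws
      "AWS_SECRET_ACCESS_KEY",   # may be omitted if credentials come from ~/.aws
      "CODE_OCEAN_API_TOKEN",
      "CODE_OCEAN_DOMAIN",
  )

  missing = [name for name in REQUIRED_VARS if not os.getenv(name)]
  if missing:
      print(f"missing environment variables: {', '.join(missing)}")
  else:
      print("all required credentials are set")
  ```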
- now in Python we can find sessions that are available to work with:

      >>> import npc_lims

      # get a sequence of `SessionInfo` dataclass instances, one per session:
      >>> tracked_sessions: tuple[npc_lims.SessionInfo, ...] = npc_lims.get_session_info()

      # each `SessionInfo` instance has minimal metadata about its session:
      >>> tracked_sessions[0]  # doctest: +SKIP
      npc_lims.SessionInfo(id='626791_2022-08-15', subject=626791, date='2022-08-15', idx=0, project='DRPilotSession', is_ephys=True, is_sync=True, allen_path=PosixUPath('//allen/programs/mindscope/workgroups/dynamicrouting/PilotEphys/Task 2 pilot/DRpilot_626791_20220815'))
      >>> tracked_sessions[0].is_ephys  # doctest: +SKIP
      False

      # currently, we're only tracking behavior and ephys sessions that use variants of
      # https://github.com/samgale/DynamicRoutingTask/blob/main/TaskControl.py:
      >>> all(s.date.year >= 2022 for s in tracked_sessions)
      True
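  As a small follow-on sketch (using only the `is_ephys` flag shown in the repr above, assumed to be a plain boolean), the sessions can be split by type:

  ```python
  import npc_lims

  sessions = npc_lims.get_session_info()

  # partition sessions using the `is_ephys` flag shown in the repr above
  ephys = [s for s in sessions if s.is_ephys]
  behavior_only = [s for s in sessions if not s.is_ephys]
  print(f"{len(ephys)} ephys sessions, {len(behavior_only)} behavior-only sessions")
  ```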
- "tracked sessions" are discovered via 3 routes (a fetch sketch for the first route follows this list):
  - https://github.com/AllenInstitute/npc_lims/blob/main/tracked_sessions.yaml
  - \\allen\programs\mindscope\workgroups\dynamicrouting\DynamicRoutingTask\DynamicRoutingTraining.xlsx
  - \\allen\programs\mindscope\workgroups\dynamicrouting\DynamicRoutingTask\DynamicRoutingTrainingNSB.xlsx
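  For the first route, a minimal fetch sketch follows; the raw-file URL form is an assumption based on the repository link above, and the YAML's internal layout isn't documented here, so we only peek at the top level:

  ```python
  import urllib.request

  import yaml  # PyYAML, assumed to be installed

  # assumed raw-URL form of the tracked_sessions.yaml linked above
  RAW_URL = "https://raw.githubusercontent.com/AllenInstitute/npc_lims/main/tracked_sessions.yaml"

  with urllib.request.urlopen(RAW_URL) as response:
      tracked = yaml.safe_load(response.read())

  # layout not documented here: just inspect the top-level structure
  if isinstance(tracked, dict):
      print(list(tracked))
  else:
      print(type(tracked).__name__, len(tracked))
  ```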
Download files
Source Distribution: npc_lims-0.1.190.tar.gz

Built Distribution: npc_lims-0.1.190-py3-none-any.whl
File details
Details for the file npc_lims-0.1.190.tar.gz.
File metadata
- Download URL: npc_lims-0.1.190.tar.gz
- Upload date:
- Size: 40.5 kB
- Tags: Source
- Uploaded using Trusted Publishing? Yes
- Uploaded via: pdm/2.23.1.dev4+g0fabc96 CPython/3.9.21 Linux/6.8.0-1021-azure
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 735aca3195c48771b9738d9b344501181a4ca518819f946e3d44841539b657ef |
| MD5 | 5e3c6696d5ee599b0beaf094e1cd63f0 |
| BLAKE2b-256 | 86831efdd65c5c39946a739d9a98ee622ece5d4f558528fe02736b7f3a5d59b6 |
File details
Details for the file npc_lims-0.1.190-py3-none-any.whl.
File metadata
- Download URL: npc_lims-0.1.190-py3-none-any.whl
- Upload date:
- Size: 46.8 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? Yes
- Uploaded via: pdm/2.23.1.dev4+g0fabc96 CPython/3.9.21 Linux/6.8.0-1021-azure
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 6a4e96d3c42a6c5bc85089e2f3273fdbad4d637e1617f8740e0bda4181099b97 |
| MD5 | fc2c454003f0cfd0796050f1aa70bf0d |
| BLAKE2b-256 | ad6cb1bb6e8b71aa9d0191383fd5eb041dcbb8c00d0acb60e543c30579b59555 |
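To confirm that a downloaded file matches the digests listed above, here is a small hashlib sketch (assuming the files sit in the current working directory):

```python
import hashlib
from pathlib import Path

# SHA256 digests copied from the tables above
EXPECTED_SHA256 = {
    "npc_lims-0.1.190.tar.gz": "735aca3195c48771b9738d9b344501181a4ca518819f946e3d44841539b657ef",
    "npc_lims-0.1.190-py3-none-any.whl": "6a4e96d3c42a6c5bc85089e2f3273fdbad4d637e1617f8740e0bda4181099b97",
}

for name, expected in EXPECTED_SHA256.items():
    path = Path(name)
    if not path.exists():
        continue  # only check files that have actually been downloaded
    digest = hashlib.sha256(path.read_bytes()).hexdigest()
    print(name, "OK" if digest == expected else f"MISMATCH: {digest}")
```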