# npc_lims

neuropixels cloud lab information management system

Tools to fetch and update paths, metadata and state for Mindscope Neuropixels sessions, in the cloud.
## Quickstart
- make a new Python >=3.9 virtual environment with conda or venv (the lighter option, since this package does not require pandas, numpy, etc.):

  ```
  python -m venv .venv
  ```
- activate the virtual environment:

  - Windows:

    ```
    .venv\scripts\activate
    ```

  - Unix:

    ```
    source .venv/bin/activate
    ```
- install the package:

  ```
  python -m pip install npc_lims
  ```
- setup credentials

  - required environment variables:

    - AWS S3:
      - `AWS_DEFAULT_REGION`
      - `AWS_ACCESS_KEY_ID`
      - `AWS_SECRET_ACCESS_KEY`
      - needed to find and read files on S3
      - you must have read access on the relevant AIND buckets
      - these can also be supplied in the standard `~/.aws` location, as used by the AWS CLI and boto3

    - CodeOcean API:
      - `CODE_OCEAN_API_TOKEN`
      - `CODE_OCEAN_DOMAIN`
      - needed to find processed data in "data assets" via the CodeOcean API
      - the token is generated in CodeOcean:
        - right click on `Account` (bottom left, person icon), then click `User Secrets` - these are secrets that can be made available as environment variables in CodeOcean capsules
        - go to `Access Tokens` and click `Generate new token` - this is for programmatically querying CodeOcean's databases
        - in `Token Name` enter `Codeocean API (read)` and check `read` on capsules and datasets - a token will be generated: click copy (and store it in a password manager, if you use one)
        - head back to `User Secrets`, where we'll paste the token into a new secret via `Add secret > API credentials`:
          - in `description` enter `Codeocean API (read)`
          - in `API key` enter `CODE_OCEAN_API_KEY`
          - in `API secret` paste the copied token from before (it should start with `cop_...`)
      - `CODE_OCEAN_DOMAIN` is the CodeOcean https address, up to and including `.org`
  - environment variables can also be specified in a file named `.env` in the current working directory
    - example: https://www.dotenv.org/docs/security/env.html
    - be very careful that this file does not get pushed to public locations, e.g. github
      - if using git, add it to a `.gitignore` file in your project's root directory:

        ```
        .env*
        ```
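For illustration, such a `.env` file might look like the following - all values here are placeholders, not real credentials or the actual domain:

```
AWS_DEFAULT_REGION=us-west-2
AWS_ACCESS_KEY_ID=AKIA...your-key-id
AWS_SECRET_ACCESS_KEY=...your-secret-key
CODE_OCEAN_API_TOKEN=cop_...your-token
CODE_OCEAN_DOMAIN=https://example.codeocean.org
```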
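Once the variables are set (in the shell or via `.env`), a quick sanity check can confirm they are visible to Python before importing `npc_lims`. This is just an illustrative sketch - `REQUIRED_VARS` and `missing_credentials` are not part of the package:

```python
import os

# the variable names documented above (not an official or exhaustive list)
REQUIRED_VARS = (
    "AWS_DEFAULT_REGION",
    "AWS_ACCESS_KEY_ID",
    "AWS_SECRET_ACCESS_KEY",
    "CODE_OCEAN_API_TOKEN",
    "CODE_OCEAN_DOMAIN",
)


def missing_credentials() -> list[str]:
    """Return the names of required environment variables that are unset or empty."""
    return [name for name in REQUIRED_VARS if not os.environ.get(name)]


if __name__ == "__main__":
    missing = missing_credentials()
    if missing:
        print("missing environment variables:", ", ".join(missing))
    else:
        print("all credentials present")
```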
- now in Python we can find sessions that are available to work with:

  ```python
  >>> import npc_lims

  # get a sequence of `SessionInfo` dataclass instances, one per session:
  >>> tracked_sessions: tuple[npc_lims.SessionInfo, ...] = npc_lims.get_session_info()

  # each `SessionInfo` instance has minimal metadata about its session:
  >>> tracked_sessions[0]              # doctest: +SKIP
  npc_lims.SessionInfo(id='626791_2022-08-15', subject=626791, date='2022-08-15', idx=0, project='DRPilotSession', is_ephys=True, is_sync=True, allen_path=PosixUPath('//allen/programs/mindscope/workgroups/dynamicrouting/PilotEphys/Task 2 pilot/DRpilot_626791_20220815'))
  >>> tracked_sessions[0].is_ephys     # doctest: +SKIP
  False

  # currently, we're only tracking behavior and ephys sessions that use variants of
  # https://github.com/samgale/DynamicRoutingTask/blob/main/TaskControl.py:
  >>> all(s.date.year >= 2022 for s in tracked_sessions)
  True
  ```
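To make the shape of those records concrete, here is a minimal stand-alone model of a `SessionInfo`-like record. The field names mirror the repr above, but this is a hypothetical sketch for illustration, not the package's actual class definition (the real field types may differ):

```python
import dataclasses
import datetime


@dataclasses.dataclass(frozen=True)
class SessionInfoSketch:
    """Illustrative stand-in for npc_lims.SessionInfo; fields mirror the repr above."""

    id: str
    subject: int
    date: datetime.date
    idx: int
    project: str
    is_ephys: bool
    is_sync: bool
    allen_path: str


# filtering by metadata works the same way as with the real dataclass instances:
sessions = (
    SessionInfoSketch(
        id="626791_2022-08-15",
        subject=626791,
        date=datetime.date(2022, 8, 15),
        idx=0,
        project="DRPilotSession",
        is_ephys=True,
        is_sync=True,
        allen_path="//allen/programs/mindscope/workgroups/dynamicrouting/PilotEphys/Task 2 pilot/DRpilot_626791_20220815",
    ),
)
ephys_sessions = [s for s in sessions if s.is_ephys and s.date.year >= 2022]
```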
- "tracked sessions" are discovered via three routes:

  - https://github.com/AllenInstitute/npc_lims/blob/main/tracked_sessions.yaml
  - `\\allen\programs\mindscope\workgroups\dynamicrouting\DynamicRoutingTask\DynamicRoutingTraining.xlsx`
  - `\\allen\programs\mindscope\workgroups\dynamicrouting\DynamicRoutingTask\DynamicRoutingTrainingNSB.xlsx`