Neuro-iX Tools

Common CLI tools and utilities for the Neuro-iX lab.

Getting started

Install

You will need an environment with at least Python 3.11. Then run:

pip install neuro-ix-tools

Otherwise, you can clone the repository and use:

python cli.py

instead of neuro-ix.

Setup

If you are using the package, you can just run:

neuro-ix init

which provides sane defaults for Narval. The configuration file is stored in your .config folder.

If you are using the repository, we advise you to use a local .env file, based on the template available in .example.env.
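For illustration, loading such a file amounts to parsing simple KEY=VALUE lines. This is a minimal sketch, assuming the .env follows the usual dotenv format; the actual variable names used by the project live in .example.env and are not shown here:

```python
# Sketch: parse a simple dotenv-style file into a dict.
# Illustrative only; not part of neuro-ix-tools, which may rely
# on a dedicated dotenv library instead.
from pathlib import Path

def load_env(path: str) -> dict[str, str]:
    """Parse KEY=VALUE lines, ignoring blanks and # comments."""
    env = {}
    for line in Path(path).read_text().splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        env[key.strip()] = value.strip()
    return env
```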

Usage

Inside this environment you have access to the neuro-ix command, which exposes one principal tool (for now).

FreeSurfer recon-all on SLURM cluster

For Neuro-iX, we defined a pipeline that simplifies running FreeSurfer on the Narval SLURM cluster. The main command is:

neuro-ix freesurfer recon-all

It processes every subject in either a BIDS or a CAPS (Clinica) dataset with FreeSurfer, using one SLURM job per subject. It accepts the following arguments:

  • --bids-dataset to specify the path to the root of a BIDS-compliant dataset
  • --clinica-dataset to specify the path to the root of a Clinica-compliant dataset (CAPS)
  • --cortical-stats, a flag to store only FreeSurfer's stats files
  • --start-from to specify the subject index at which to restart the command; it is needed when a dataset has more than 1000 subjects, because of Narval's job-submission limit

Example:

neuro-ix freesurfer recon-all --bids-dataset /path/to/dataset

If the dataset includes 1500 subjects, once all jobs are finished you will need to run:

neuro-ix freesurfer recon-all --bids-dataset /path/to/dataset --start-from 1000
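The batching above can be sketched as a small helper that yields the --start-from values needed to cover a dataset of a given size. This is illustrative only and not part of the package; it assumes a fixed batch size of 1000, matching Narval's job-submission limit:

```python
# Sketch: compute the --start-from values needed to process a
# dataset in batches of 1000 subjects (Narval's job limit).
# Illustrative helper, not part of neuro-ix-tools.
def start_indices(n_subjects: int, batch_size: int = 1000) -> list[int]:
    """Return the subject index at which each batch begins."""
    return list(range(0, n_subjects, batch_size))

# For 1500 subjects, two runs are needed: one from 0, one from 1000.
print(start_indices(1500))  # [0, 1000]
```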

Library

As a library, the neuro_ix package exposes:

  • Classes to interact with and query both BIDS and CAPS datasets for T1w MRIs
  • Extendable command classes
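For illustration, the kind of T1w query these dataset classes provide can be sketched directly with pathlib. This is a sketch only, mimicking the behavior rather than the package's API; the actual neuro_ix class and method names may differ:

```python
# Sketch: find T1w MRIs in a BIDS-layout dataset using only pathlib.
# This mimics what the neuro_ix dataset classes do; it is NOT their API.
from pathlib import Path

def find_t1w(bids_root: str) -> list[Path]:
    """Return all T1w NIfTI files under sub-*/.../anat/ (BIDS convention)."""
    root = Path(bids_root)
    # '**' matches zero or more directories, so this covers both
    # sub-XX/anat/ and session layouts like sub-XX/ses-YY/anat/.
    return sorted(root.glob("sub-*/**/anat/*_T1w.nii*"))
```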

Contribute

Setup

Once cloned, to get all necessary development packages, run:

pip install -r dev_requirements.txt

Tests

Test tools

We use pytest for our unit tests and pytest-cov for coverage. You can easily run them in VSCode or with the following command:

pytest --cov

We use ruff for linting and formatting, which is automatically applied via pre-commit. We also use ssort, pydocstyle, mypy, and pylint to ensure consistent code quality.

Test data

All test data are extracted from MR-ART:

Nárai, Á., Hermann, P., Auer, T. et al. Movement-related artefacts (MR-ART) dataset of matched motion-corrupted and clean structural MRI brain scans. Sci Data 9, 630 (2022). https://doi.org/10.1038/s41597-022-01694-8
