CLI and utils for the Neuro-iX lab

Neuro-iX Tools

Common Tools for the Neuro-iX Lab

Getting Started

Installation

You will need an environment with Python 3.11 or later, then run:

pip install neuro-ix-tools

Alternatively, you can clone the repository and use:

python cli.py

instead of neuro-ix.

Setup

If you are using the package, simply run:

neuro-ix init

This provides sensible defaults for Narval. The configuration file is stored in your .config folder.

If you are using the repository directly, we recommend using a local .env file. A template is available at .example.env.
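As a minimal sketch of that setup step, assuming you are at the repository root (the variable written into the stand-in template below is hypothetical; the real .example.env shipped with the repository defines its own variables):

```shell
# Stand-in template so this snippet is self-contained; in the actual
# repository, .example.env already exists and its contents differ.
printf 'NEURO_IX_EXAMPLE_VAR=/path/to/somewhere\n' > .example.env

# Copy the template to the local .env the tools read, then edit the values.
cp .example.env .env
```

After copying, open .env and replace the placeholder values with paths valid for your machine.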

Usage

Inside this environment, you have access to the neuro-ix command, which currently exposes one main tool.

FreeSurfer recon-all on SLURM Cluster

We provide a pipeline that simplifies the usage of FreeSurfer on the Narval SLURM cluster. The main command is:

neuro-ix freesurfer recon-all

This allows users to process all subjects in either a BIDS or CAPS (Clinica) dataset with FreeSurfer, using one SLURM job per subject.

Arguments:
  • --bids-dataset: Path to the root of a BIDS-compliant dataset
  • --clinica-dataset: Path to the root of a Clinica-compliant (CAPS) dataset
  • --cortical-stats: Flag to store only FreeSurfer's stats files
  • --start-from: Subject index to resume processing from; needed when a dataset has more than 1000 subjects, since Narval's job limit caps how many jobs can be queued at once

Example:

neuro-ix freesurfer recon-all --bids-dataset /path/to/dataset

If your dataset includes more than 1000 subjects (e.g., 1500), once the first batch is done, run:

neuro-ix freesurfer recon-all --bids-dataset /path/to/dataset --start-from 1000
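The batching arithmetic behind --start-from can be sketched as follows; the 1000-job cap is Narval's limit mentioned above, and batch_starts is an illustrative helper, not part of the package:

```python
# Narval caps how many jobs can be queued at once, so large datasets are
# processed in batches; each later batch resumes via --start-from.
JOB_LIMIT = 1000  # Narval's job limit, per the text above

def batch_starts(n_subjects: int, limit: int = JOB_LIMIT) -> list[int]:
    """Return the subject index each successive batch should start from."""
    return list(range(0, n_subjects, limit))

# A 1500-subject dataset needs two runs: one from index 0, and one
# resumed with --start-from 1000 once the first batch is done.
print(batch_starts(1500))  # [0, 1000]
```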

Library

As a library, the neuro_ix package exposes:

  • Classes to interact with and query BIDS and CAPS datasets for T1-weighted MRIs
  • Extendable command classes
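To illustrate the kind of T1-weighted query those dataset classes perform, here is a self-contained sketch using plain pathlib; the real neuro_ix class and method names are not reproduced here, only the BIDS layout being queried:

```python
import tempfile
from pathlib import Path

def find_t1w(bids_root: Path) -> list[Path]:
    """Collect T1-weighted scans following the BIDS naming convention."""
    return sorted(bids_root.glob("sub-*/anat/*_T1w.nii.gz"))

# Minimal fake BIDS tree, just to demonstrate the query pattern.
root = Path(tempfile.mkdtemp())
for sub in ("sub-01", "sub-02"):
    anat = root / sub / "anat"
    anat.mkdir(parents=True)
    (anat / f"{sub}_T1w.nii.gz").touch()

print([p.name for p in find_t1w(root)])  # ['sub-01_T1w.nii.gz', 'sub-02_T1w.nii.gz']
```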

Contributing

Setup

Once the repository is cloned, install the development dependencies with:

pip install -r dev_requirements.txt

Tests

Test Tools

We use:

  • pytest for unit tests
  • pytest-cov for coverage reports (run the tests with pytest --cov)
  • ruff for linting and formatting (applied automatically via pre-commit)
  • Additional code-quality tools: ssort, pydocstyle, mypy, and pylint

Test Data

All test data are extracted from MR-ART:

Nárai, Á., Hermann, P., Auer, T. et al. Movement-related artefacts (MR-ART) dataset of matched motion-corrupted and clean structural MRI brain scans. Sci Data 9, 630 (2022). https://doi.org/10.1038/s41597-022-01694-8

Deployment

Build the package with:

python -m build

and upload it to PyPI with:

twine upload dist/*
