
BrainLes-Preprocessing


BrainLes preprocessing is a comprehensive tool for preprocessing tasks in biomedical imaging, with a focus on (but not limited to) multi-modal brain MRI. It can be used to build modular preprocessing pipelines, including normalization, co-registration, atlas registration, skull stripping / brain extraction, and defacing.

BrainLes is written to be backend-agnostic, meaning the registration, brain extraction, and defacing tools can be swapped.

Installation

With a Python 3.10+ environment you can install directly from pypi.org:

pip install brainles-preprocessing

We recommend using Python 3.10 / 3.11 / 3.12.

[!NOTE]
For Python 3.13, the installation can currently fail with the error Failed to build antspyx. This usually means there is no pre-built wheel for the package, so it has to be built locally. This requires cmake (install it, e.g., with pip install cmake) and quite some time. Rerunning the installation with cmake installed should fix the error.
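In practice, the workaround boils down to installing cmake first and then re-running the install (the second command simply repeats the installation from above):

pip install cmake
pip install brainles-preprocessing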

Usage

A minimal example that registers a t1c image (center modality) and one moving modality (flair) to the standard atlas using ANTs and skull strips them using HD-BET could look like this:

from pathlib import Path
from brainles_preprocessing.modality import Modality, CenterModality
from brainles_preprocessing.normalization.percentile_normalizer import (
    PercentileNormalizer,
)
from brainles_preprocessing.preprocessor import Preprocessor

patient_folder = Path("/home/marcelrosier/preprocessing/patient")

# specify a normalizer
percentile_normalizer = PercentileNormalizer(
    lower_percentile=0.1,
    upper_percentile=99.9,
    lower_limit=0,
    upper_limit=1,
)

# define center and moving modalities
center = CenterModality(
    modality_name="t1c",
    input_path=patient_folder / "t1c.nii.gz",
    normalizer=percentile_normalizer,
    # specify the output paths for the raw and normalized images of each step - here the atlas-registered (skull), brain-extracted (bet), and defaced images
    raw_skull_output_path="patient/raw_skull_dir/t1c_skull_raw.nii.gz",
    raw_bet_output_path="patient/raw_bet_dir/t1c_bet_raw.nii.gz",
    raw_defaced_output_path="patient/raw_defaced_dir/t1c_defaced_raw.nii.gz",
    normalized_skull_output_path="patient/norm_skull_dir/t1c_skull_normalized.nii.gz",
    normalized_bet_output_path="patient/norm_bet_dir/t1c_bet_normalized.nii.gz",
    normalized_defaced_output_path="patient/norm_defaced_dir/t1c_defaced_normalized.nii.gz",
    # specify output paths for the brain extraction and defacing masks
    bet_mask_output_path="patient/masks/t1c_bet_mask.nii.gz",
    defacing_mask_output_path="patient/masks/t1c_defacing_mask.nii.gz",
)

moving_modalities = [
    Modality(
        modality_name="flair",
        input_path=patient_folder / "flair.nii.gz",
        normalizer=percentile_normalizer,
        # specify the output paths for the raw and normalized images of each step - here the atlas-registered (skull), brain-extracted (bet), and defaced images
        raw_skull_output_path="patient/raw_skull_dir/fla_skull_raw.nii.gz",
        raw_bet_output_path="patient/raw_bet_dir/fla_bet_raw.nii.gz",
        raw_defaced_output_path="patient/raw_defaced_dir/fla_defaced_raw.nii.gz",
        normalized_skull_output_path="patient/norm_skull_dir/fla_skull_normalized.nii.gz",
        normalized_bet_output_path="patient/norm_bet_dir/fla_bet_normalized.nii.gz",
        normalized_defaced_output_path="patient/norm_defaced_dir/fla_defaced_normalized.nii.gz",
    )
]

# instantiate and run the preprocessor using the default registration / brain extraction / defacing backends
preprocessor = Preprocessor(
    center_modality=center,
    moving_modalities=moving_modalities,
)

preprocessor.run()

The package lets you choose the registration backend, brain extraction tool, and defacing method.
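As a sketch of how such a swap might look (the exact import paths, class names such as ANTsRegistrator, HDBetExtractor, and QuickshearDefacer, and the registrator / brain_extractor / defacer keyword arguments are assumptions here; please check the documentation and tutorials for the current API):

from brainles_preprocessing.preprocessor import Preprocessor
from brainles_preprocessing.registration import ANTsRegistrator
from brainles_preprocessing.brain_extraction import HDBetExtractor
from brainles_preprocessing.defacing import QuickshearDefacer

# reuses the center and moving_modalities objects from the example above;
# class and keyword-argument names are assumptions for illustration
preprocessor = Preprocessor(
    center_modality=center,
    moving_modalities=moving_modalities,
    registrator=ANTsRegistrator(),     # swap e.g. for a NiftyReg- or eReg-based registrator
    brain_extractor=HDBetExtractor(),  # HD-BET is the currently supported brain extraction tool
    defacer=QuickshearDefacer(),       # Quickshear is the currently supported defacing method
)

preprocessor.run()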
An example notebook with four modalities and further outputs and customizations is available via nbviewer and Google Colab.

For further information please have a look at our Jupyter Notebook tutorials in our tutorials repo (WIP).

Documentation

We provide (work-in-progress) documentation for the package.

FAQ

Please credit the authors by citing their work.

Registration

We currently provide support for ANTs (default), NiftyReg (Linux), and eReg (experimental).

Atlas Reference

We provide the SRI-24 atlas from this publication. However, custom atlases in NIfTI format are supported.
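A minimal sketch of pointing the pipeline at a custom atlas; the keyword used below (atlas_image_path) and its acceptance of a plain file path are assumptions, so please verify against the documentation:

from pathlib import Path
from brainles_preprocessing.preprocessor import Preprocessor

preprocessor = Preprocessor(
    center_modality=center,
    moving_modalities=moving_modalities,
    # assumed keyword for illustration; by default the bundled SRI-24 atlas is used
    atlas_image_path=Path("/path/to/custom_atlas.nii.gz"),
)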

Brain extraction

We currently provide support for HD-BET.

Defacing

We currently provide support for Quickshear.

