
Tool for preprocessing tasks in biomedical imaging, with a focus on (but not limited to) multi-modal brain MRI

Project description

BrainLes-Preprocessing


BrainLes preprocessing is a comprehensive tool for preprocessing tasks in biomedical imaging, with a focus on (but not limited to) multi-modal brain MRI. It can be used to build modular preprocessing pipelines.

This includes normalization, co-registration, atlas registration, and skull stripping / brain extraction.

BrainLes is written to be backend-agnostic, meaning the registration, brain extraction, and defacing tools can be swapped out.

Installation

With a Python 3.10+ environment you can install directly from pypi.org:

pip install brainles-preprocessing
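
To verify the installation, you can, for example, query the installed version from Python (this sketch uses only the standard library):

# quick sanity check that the package is installed
import importlib.metadata

print(importlib.metadata.version("brainles-preprocessing"))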

Usage

A minimal example that registers (to the standard atlas using ANTs) and skull strips (using HD-BET) a t1c image (center modality) together with one moving modality (flair) could look like this:

from pathlib import Path
from brainles_preprocessing.modality import Modality, CenterModality
from brainles_preprocessing.normalization.percentile_normalizer import (
    PercentileNormalizer,
)
from brainles_preprocessing.preprocessor import Preprocessor

patient_folder = Path("/home/marcelrosier/preprocessing/patient")

# specify a normalizer
percentile_normalizer = PercentileNormalizer(
    lower_percentile=0.1,
    upper_percentile=99.9,
    lower_limit=0,
    upper_limit=1,
)

# define center and moving modalities
center = CenterModality(
    modality_name="t1c",
    input_path=patient_folder / "t1c.nii.gz",
    normalizer=percentile_normalizer,
    # specify the output paths for the raw and normalized images of each step - here only for the atlas-registered and brain-extracted images
    raw_skull_output_path="patient/raw_skull_dir/t1c_skull_raw.nii.gz",
    raw_bet_output_path="patient/raw_bet_dir/t1c_bet_raw.nii.gz",
    normalized_skull_output_path="patient/norm_skull_dir/t1c_skull_normalized.nii.gz",
    normalized_bet_output_path="patient/norm_bet_dir/t1c_bet_normalized.nii.gz",
    # specify output paths for the brain extraction and defacing masks
    bet_mask_output_path="patient/masks/t1c_bet_mask.nii.gz",
    defacing_mask_output_path="patient/masks/t1c_defacing_mask.nii.gz",
)

moving_modalities = [
    Modality(
        modality_name="flair",
        input_path=patient_folder / "flair.nii.gz",
        normalizer=percentile_normalizer,
        # specify the output paths for the raw and normalized images of each step - here only for the atlas-registered and brain-extracted images
        raw_skull_output_path="patient/raw_skull_dir/fla_skull_raw.nii.gz",
        raw_bet_output_path="patient/raw_bet_dir/fla_bet_raw.nii.gz",
        normalized_skull_output_path="patient/norm_skull_dir/fla_skull_normalized.nii.gz",
        normalized_bet_output_path="patient/norm_bet_dir/fla_bet_normalized.nii.gz",
    )
]

# instantiate and run the preprocessor using the default registration, brain extraction, and defacing backends
preprocessor = Preprocessor(
    center_modality=center,
    moving_modalities=moving_modalities,
)

preprocessor.run()

The package lets you choose among registration backends, brain extraction tools, and defacing methods.
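
As a rough sketch, swapping backends could look like the following; the import paths and class names here are assumptions based on the supported backends listed in the FAQ below and may differ between versions, so please check the API documentation:

# NOTE: import paths and class names are assumptions for illustration only;
# consult the brainles_preprocessing documentation for the exact API.
from brainles_preprocessing.preprocessor import Preprocessor
from brainles_preprocessing.registration import NiftyRegRegistrator  # e.g. instead of the ANTs default
from brainles_preprocessing.brain_extraction import HDBetExtractor
from brainles_preprocessing.defacing import QuickshearDefacer

preprocessor = Preprocessor(
    center_modality=center,              # reuse the modalities defined above
    moving_modalities=moving_modalities,
    registrator=NiftyRegRegistrator(),
    brain_extractor=HDBetExtractor(),
    defacer=QuickshearDefacer(),
)
preprocessor.run()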
An example notebook with four modalities and further outputs and customizations can be found via these badges:

nbviewer Open In Colab

For further information please have a look at our Jupyter Notebook tutorials in our tutorials repo (WIP).

Documentation

We provide (work-in-progress) documentation. Have a look here.

FAQ

Please credit the authors by citing their work.

Registration

We currently provide support for ANTs (default), NiftyReg (Linux only), and eReg (experimental).

Atlas Reference

We provide the SRI-24 atlas from this publication. However, custom atlases in NIfTI format are supported.
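
As an illustration, pointing the preprocessor at a custom atlas might look like the sketch below; the atlas_image_path parameter name is an assumption and may differ, so please check the Preprocessor documentation for the exact argument:

# NOTE: the atlas_image_path argument name is an assumption for illustration;
# check the Preprocessor API documentation for the exact parameter.
preprocessor = Preprocessor(
    center_modality=center,
    moving_modalities=moving_modalities,
    atlas_image_path="/path/to/custom_atlas.nii.gz",  # any atlas in NIfTI format
)
preprocessor.run()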

Brain extraction

We currently provide support for HD-BET.

Defacing

We currently provide support for Quickshear.

Download files


Source Distributions

No source distribution files are available for this release.

Built Distribution

brainles_preprocessing-0.3.0-py3-none-any.whl (19.7 MB)


File details

Details for the file brainles_preprocessing-0.3.0-py3-none-any.whl.

File hashes

Hashes for brainles_preprocessing-0.3.0-py3-none-any.whl:

Algorithm    Hash digest
SHA256       b6830aa9655ce5e5c6341f70bb4730175fb15c0c1b11864285af358be4bf6922
MD5          47b8e4c9aaf9eb151a4bced2eeaaf599
BLAKE2b-256  e5aea91a9d85f1f2a0790d51a33e78ae0fabcab8f589791042d03738d9606709

