
Tool for preprocessing tasks in biomedical imaging, with a focus on (but not limited to) multi-modal brain MRI


BrainLes-Preprocessing


BrainLes preprocessing is a comprehensive tool for preprocessing tasks in biomedical imaging, with a focus on (but not limited to) multi-modal brain MRI. It can be used to build modular preprocessing pipelines.

This includes normalization, co-registration, atlas registration, skull stripping / brain extraction, N4 bias correction, and defacing. We provide means to transform images and segmentations in both directions between native and atlas space.
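To illustrate what the normalization step does conceptually, here is a minimal, self-contained sketch of percentile-based intensity normalization in NumPy. This is an illustrative re-implementation, not the package's PercentileNormalizer; the function name and defaults below mirror the example further down but are otherwise assumptions:

```python
import numpy as np

def percentile_normalize(img, lower_percentile=0.1, upper_percentile=99.9,
                         lower_limit=0.0, upper_limit=1.0):
    """Clip intensities to the given percentiles, then rescale linearly
    into [lower_limit, upper_limit]."""
    lo = np.percentile(img, lower_percentile)
    hi = np.percentile(img, upper_percentile)
    clipped = np.clip(img, lo, hi)
    scaled = (clipped - lo) / (hi - lo)  # now in [0, 1]
    return scaled * (upper_limit - lower_limit) + lower_limit

# Toy "volume" standing in for an MRI scan
rng = np.random.default_rng(0)
volume = rng.normal(loc=100.0, scale=20.0, size=(8, 8, 8))
normalized = percentile_normalize(volume)
print(normalized.min(), normalized.max())  # values span exactly [0.0, 1.0]
```

Clipping at percentiles rather than at the raw min/max makes the rescaling robust to the extreme outlier voxels that are common in MRI intensity distributions.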

BrainLes is modular and backend-agnostic: the registration, brain extraction, N4 bias correction, and defacing tools can each be skipped or swapped for alternatives.
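The backend-agnostic design can be sketched as a small interface that any tool implements. The class and method names below are purely illustrative and are not the package's actual API:

```python
from abc import ABC, abstractmethod

class RegistrationBackend(ABC):
    """Illustrative interface: any registration tool can plug in here."""
    @abstractmethod
    def register(self, fixed_path: str, moving_path: str, output_path: str) -> None: ...

class DummyBackend(RegistrationBackend):
    """Stand-in backend that just records what it was asked to do."""
    def __init__(self):
        self.calls = []
    def register(self, fixed_path, moving_path, output_path):
        self.calls.append((fixed_path, moving_path, output_path))

def run_pipeline(backend, fixed, movings):
    """Register each moving image to the fixed one; skip the step
    entirely when no backend is given."""
    if backend is None:
        return
    for moving in movings:
        out = moving.replace(".nii.gz", "_registered.nii.gz")
        backend.register(fixed, moving, out)

backend = DummyBackend()
run_pipeline(backend, "t1c.nii.gz", ["flair.nii.gz"])
print(backend.calls)  # one recorded registration call
```

Because the pipeline only depends on the abstract interface, swapping ANTs for NiftyReg (or skipping registration altogether by passing None) requires no changes to the pipeline itself.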

Installation

With a Python 3.10+ environment you can install directly from pypi.org:

pip install brainles-preprocessing

We recommend using Python 3.10 / 3.11 / 3.12.

[!NOTE]
For Python 3.13 the installation can currently fail with the error Failed to build antspyx. This usually means there is no pre-built wheel for the package, so it has to be built locally, which requires cmake (install it e.g. with pip install cmake) and can take considerable time. Rerunning the installation with cmake installed should fix the error.

Usage

A minimal example that registers a t1c image (the center modality) and one moving modality (flair) to the standard atlas using ANTs and skull strips them using HD-BET could look like this:

from pathlib import Path
from brainles_preprocessing.modality import Modality, CenterModality
from brainles_preprocessing.normalization.percentile_normalizer import (
    PercentileNormalizer,
)
from brainles_preprocessing.preprocessor import Preprocessor

patient_folder = Path("/home/marcelrosier/preprocessing/patient")

# specify a normalizer
percentile_normalizer = PercentileNormalizer(
    lower_percentile=0.1,
    upper_percentile=99.9,
    lower_limit=0,
    upper_limit=1,
)

# define center and moving modalities
center = CenterModality(
    modality_name="t1c",
    input_path=patient_folder / "t1c.nii.gz",
    normalizer=percentile_normalizer,
    # specify output paths for the raw and normalized images of selected steps -
    # here atlas registration (skull), brain extraction (bet) and defacing
    raw_skull_output_path="patient/raw_skull_dir/t1c_skull_raw.nii.gz",
    raw_bet_output_path="patient/raw_bet_dir/t1c_bet_raw.nii.gz",
    raw_defaced_output_path="patient/raw_defaced_dir/t1c_defaced_raw.nii.gz",
    normalized_skull_output_path="patient/norm_skull_dir/t1c_skull_normalized.nii.gz",
    normalized_bet_output_path="patient/norm_bet_dir/t1c_bet_normalized.nii.gz",
    normalized_defaced_output_path="patient/norm_defaced_dir/t1c_defaced_normalized.nii.gz",
    # specify output paths for the brain extraction and defacing masks
    bet_mask_output_path="patient/masks/t1c_bet_mask.nii.gz",
    defacing_mask_output_path="patient/masks/t1c_defacing_mask.nii.gz",
)

moving_modalities = [
    Modality(
        modality_name="flair",
        input_path=patient_folder / "flair.nii.gz",
        normalizer=percentile_normalizer,
        # specify output paths for the raw and normalized images of selected steps -
        # here atlas registration (skull), brain extraction (bet) and defacing
        raw_skull_output_path="patient/raw_skull_dir/fla_skull_raw.nii.gz",
        raw_bet_output_path="patient/raw_bet_dir/fla_bet_raw.nii.gz",
        raw_defaced_output_path="patient/raw_defaced_dir/fla_defaced_raw.nii.gz",
        normalized_skull_output_path="patient/norm_skull_dir/fla_skull_normalized.nii.gz",
        normalized_bet_output_path="patient/norm_bet_dir/fla_bet_normalized.nii.gz",
        normalized_defaced_output_path="patient/norm_defaced_dir/fla_defaced_normalized.nii.gz",
    )
]

# instantiate and run the preprocessor using defaults for backends (registration, brain extraction, bias correction, defacing)
preprocessor = Preprocessor(
    center_modality=center,
    moving_modalities=moving_modalities,
)

preprocessor.run()

The package lets you choose among registration backends, brain extraction tools, and defacing methods.
An example notebook covering four modalities, further outputs, and customizations is linked via these badges:

nbviewer Open In Colab

For further information, please have a look at the Jupyter Notebook tutorials in our tutorials repo (WIP).

Documentation

We provide (work-in-progress) documentation; have a look here.

FAQ

Please credit the authors of the underlying tools by citing their work.

Registration

We currently provide support for ANTs (default) and NiftyReg (Linux only).

Atlas Reference

We provide the SRI-24 atlas from this publication. However, custom atlases in NIfTI format are supported.

N4 Bias correction

We currently provide support for N4 bias correction based on SimpleITK.

Brain extraction

We currently provide support for HD-BET.

Defacing

We currently provide support for Quickshear.
