MICAFlow: Accessible, fast, and pythonic MRI processing pipeline


MICAFlow is a comprehensive neuroimaging pipeline designed for processing structural and diffusion MRI data. It offers modular components that can be used as part of a cohesive workflow or as standalone tools.

Overview

MICAFlow provides a robust and flexible framework for neuroimaging processing. By chaining together deep learning-based segmentation and advanced numerical methods, it generates precise outputs even for modalities with a low signal-to-noise ratio or strong geometric distortions.

Features

  • Structural MRI Processing: T1w and FLAIR image processing
  • Diffusion MRI Processing: Complete pipeline for DWI processing
  • Brain Extraction: SynthSeg-based brain extraction with optional cerebellum removal
  • Deep Learning Segmentation: SynthSeg for brain segmentation and parcellation
  • Image Registration: Multi-modal coregistration and spatial normalization to standard spaces
  • Texture Features: Advanced texture feature generation
  • Quality Control: Built-in QC metrics and visualization
  • Batch Processing: Automated BIDS directory scanning and processing
  • Brain-Extracted Outputs: Optional dedicated directory for all skull-stripped images
  • Temporary File Management: Option to preserve intermediate files for debugging
  • Modular Design: Components can be used independently or as a complete pipeline
  • Flexible Configuration: Via command line arguments or YAML configuration files

Installation

pip install micaflow

# Verify installation
micaflow

Usage

MICAFlow can be used as a complete pipeline or as individual modules:

Running the Full Pipeline

# Basic usage with T1w only
micaflow pipeline --subject sub-001 --session ses-01 \
  --data-directory /path/to/data --t1w-file sub-001_ses-01_T1w.nii.gz \
  --output /output/path --cores 4

# With FLAIR image
micaflow pipeline --subject sub-001 --session ses-01 \
  --data-directory /path/to/data --t1w-file sub-001_ses-01_T1w.nii.gz \
  --flair-file sub-001_ses-01_FLAIR.nii.gz --output /output/path \
  --cores 4

# With diffusion data
micaflow pipeline --subject sub-001 --session ses-01 \
  --data-directory /path/to/data --t1w-file sub-001_ses-01_T1w.nii.gz \
  --dwi-file sub-001_ses-01_dwi.nii.gz \
  --bval-file sub-001_ses-01_dwi.bval --bvec-file sub-001_ses-01_dwi.bvec \
  --inverse-dwi-file sub-001_ses-01_acq-PA_dwi.nii.gz \
  --output /output/path --cores 4
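If you prefer to drive the pipeline from Python rather than a shell script, one option is to assemble the same command line and hand it to subprocess. This is just a sketch reusing the flags shown above; the paths are placeholders.

```python
import subprocess

# Build the basic T1w-only invocation shown above programmatically.
# Flags are those documented for `micaflow pipeline`; paths are placeholders.
cmd = [
    "micaflow", "pipeline",
    "--subject", "sub-001",
    "--session", "ses-01",
    "--data-directory", "/path/to/data",
    "--t1w-file", "sub-001_ses-01_T1w.nii.gz",
    "--output", "/output/path",
    "--cores", "4",
]

# subprocess.run(cmd, check=True)  # uncomment once micaflow is installed
print(" ".join(cmd))
```

Keeping the invocation as a list makes it easy to append optional flags (e.g. the FLAIR or DWI arguments) conditionally before launching.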

Batch Processing (BIDS)

To process an entire BIDS dataset automatically, use the batch command:

micaflow bids --bids-dir /path/to/bids_root --output-dir /path/to/derivatives \
  --cores 4 --gpu

This command will:

  1. Scan the BIDS directory for valid subjects and sessions.
  2. Automatically identify T1w, FLAIR (optional), and DWI (optional) files based on suffixes.
  3. Run the pipeline sequentially for each session found.
  4. Generate a micaflow_runs_summary.json in the output directory tracking execution status.
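The exact schema of micaflow_runs_summary.json is not documented here; the sketch below assumes each entry records at least a status field (the "subject", "session", and "status" keys are illustrative assumptions) and shows one way to tally batch results.

```python
import json
import tempfile
from collections import Counter
from pathlib import Path

# Hypothetical contents of micaflow_runs_summary.json -- the real schema may
# differ; "subject", "session", and "status" are assumed field names.
sample = [
    {"subject": "sub-001", "session": "ses-01", "status": "success"},
    {"subject": "sub-002", "session": "ses-01", "status": "failed"},
]
summary_path = Path(tempfile.mkdtemp()) / "micaflow_runs_summary.json"
summary_path.write_text(json.dumps(sample))

# Tally execution status across all recorded runs.
runs = json.loads(summary_path.read_text())
counts = Counter(run["status"] for run in runs)
print(dict(counts))
```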

Key arguments:

  • --bids-dir: Root path to the BIDS dataset.
  • --output-dir: Path where derivatives will be saved.
  • --participant-label: (Optional) Space-separated list of subject IDs to process (e.g., 001 002).
  • --session-label: (Optional) Space-separated list of session IDs to process.
  • --t1w-suffix, --dwi-suffix, etc.: Customize matching patterns for input files.

Using Individual Modules

Each module can be used independently:

Brain Extraction

micaflow bet --input t1w.nii.gz --output brain.nii.gz --parcellation segmentation.nii.gz --output-mask mask.nii.gz

Brain Segmentation (SynthSeg)

micaflow synthseg --i t1w.nii.gz --o segmentation.nii.gz --parc --fast --threads 4

Image Registration

micaflow coregister --fixed-file target.nii.gz --moving-file source.nii.gz \
  --output registered.nii.gz --warp-file warp.nii.gz --affine-file affine.mat

Apply Transformations

micaflow apply_warp --moving image.nii.gz --reference target.nii.gz \
  --warp warp.nii.gz --affine affine.mat --output warped.nii.gz

Diffusion Processing

# Denoise DWI data
micaflow denoise --input dwi.nii.gz --bval dwi.bval --bvec dwi.bvec --output denoised_dwi.nii.gz

# Motion correction
micaflow motion_correction --denoised denoised_dwi.nii.gz --input-bvecs dwi.bvec --output-bvecs corrected.bvec --output motion_corrected_dwi.nii.gz

# Susceptibility distortion correction
micaflow SDC --input motion_corrected_dwi.nii.gz --reverse-image reverse_phase_dwi.nii.gz \
  --output corrected_dwi.nii.gz --output-warp sdc_warpfield.nii.gz

# Compute DTI metrics
micaflow compute_fa_md --input preprocessed_dwi.nii.gz --bval dwi.bval --bvec dwi.bvec \
  --output-fa fa_map.nii.gz --output-md md_map.nii.gz
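As a refresher on what compute_fa_md produces: FA and MD are derived from the eigenvalues of the fitted diffusion tensor. Below is a minimal, dependency-free sketch of the standard formulas, not MICAFlow's internal implementation.

```python
import math

def dti_metrics(l1, l2, l3):
    """Fractional anisotropy and mean diffusivity from tensor eigenvalues."""
    md = (l1 + l2 + l3) / 3.0  # mean diffusivity is the eigenvalue average
    num = (l1 - md) ** 2 + (l2 - md) ** 2 + (l3 - md) ** 2
    den = l1 ** 2 + l2 ** 2 + l3 ** 2
    fa = math.sqrt(1.5 * num / den) if den > 0 else 0.0
    return fa, md

print(dti_metrics(1.0, 1.0, 1.0))  # isotropic diffusion: FA = 0
print(dti_metrics(1.0, 0.0, 0.0))  # fully anisotropic: FA = 1
```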

Texture Feature Extraction

micaflow texture_generation --input image.nii.gz --mask mask.nii.gz --output texture_features

Pipeline Workflow

The pipeline performs the following processing steps:

  1. Brain Extraction: Using SynthSeg-based segmentation for T1w, FLAIR, and DWI
    • Optionally remove cerebellum with --rm-cerebellum
  2. Bias Field Correction: Using N4 bias field correction
  3. Brain Segmentation: Using SynthSeg for T1w and FLAIR
  4. Registration:
    • FLAIR to T1w space
    • T1w to MNI152 standard space
  5. DWI Processing (if enabled):
    • Denoising with Patch2Self
    • Motion correction
    • Susceptibility distortion correction
    • Tensor fitting and DTI metrics calculation
    • Registration to T1w space
  6. Texture Feature Generation: On normalized images
  7. Brain-Extracted Outputs (if --extract-brain enabled):
    • Creates skull-stripped versions of all outputs in dedicated directory
    • Also creates normalized versions
  8. Quality Control: Calculating quality metrics
  9. Cleanup: Removes temporary files (unless --keep-temp is specified)

Output Structure

The pipeline creates a structured output directory:

output/
├── <subject>/
│   └── <session>/
│       ├── anat/             # Anatomical images (brain-extracted, bias-corrected)
│       ├── dwi/              # Processed diffusion data and DTI metrics
│       ├── metrics/          # Quality metrics and DICE scores
│       ├── textures/         # Texture features
│       └── xfm/              # Transformation matrices and warps
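The layout above can be traversed with pathlib to collect results per session. Since the file names inside each folder are not specified here, this sketch only enumerates the documented subdirectories, recreated in a scratch directory for illustration.

```python
import tempfile
from pathlib import Path

# Recreate the documented output layout in a scratch directory.
root = Path(tempfile.mkdtemp())
for sub in ("anat", "dwi", "metrics", "textures", "xfm"):
    (root / "sub-001" / "ses-01" / sub).mkdir(parents=True)

# Enumerate sessions and the documented subdirectories under each one.
for session in sorted(root.glob("sub-*/ses-*")):
    subdirs = sorted(p.name for p in session.iterdir() if p.is_dir())
    print(session.relative_to(root), subdirs)
```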

Configuration

MICAFlow can be configured via:

  1. Command Line Arguments: For quick setup and individual module usage
  2. Configuration File: YAML file for complex setups (specify with --config-file)
  3. Default Configuration: Located in config.yaml
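A configuration file might look like the following. The key names here simply mirror the command-line flags shown above and are assumptions, not a documented schema; consult the bundled config.yaml for the authoritative keys.

```yaml
# Hypothetical config sketch -- keys mirror the CLI flags shown earlier;
# check the packaged config.yaml for the real schema.
subject: sub-001
session: ses-01
data_directory: /path/to/data
t1w_file: sub-001_ses-01_T1w.nii.gz
flair_file: sub-001_ses-01_FLAIR.nii.gz
output: /output/path
cores: 4
```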

System Requirements

  • CPU: Multi-core recommended for parallel processing
  • RAM: 8GB minimum, 16GB+ recommended
  • GPU: Optional but recommended for faster processing (CUDA compatible)
  • Disk Space: Depends on dataset size, minimum 10GB recommended

Supported Image Formats

  • NIfTI (.nii, .nii.gz)
  • BIDS-compatible directory structures

Support and Contact

For issues, questions or feature requests, please open an issue on the GitHub repository.

