PyALFE

Python implementation of the Automated Lesion and Feature Extraction (ALFE) pipeline. We developed this pipeline for the analysis of brain MRIs of patients suffering from conditions that cause brain lesions. It utilizes image processing tools, image registration tools, and deep learning segmentation models to produce a set of features that describe the lesions in the brain.

Requirements

PyALFE supports Linux x86-64, Mac x86-64, and Mac arm64 and requires Python 3.9+.

Image registration and processing

PyALFE can be configured to use either Greedy or AntsPy as its registration tool. Similarly, PyALFE can be configured to use either Convert3D or the native Python library Nilearn for image processing tasks. To use Greedy and Convert3D, these command line tools should be installed on your system.
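
If you plan to use Greedy and Convert3D, you can verify that they are available on your PATH, for example:

which greedy
which c3d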

Installation

Clone the repo

git clone https://github.com/reghbali/pyalfe.git
cd pyalfe

Then run (we recommend using a Python virtual environment):

pip install --upgrade pip

You can either install pyalfe in development mode or build and install.

Option 1: Development mode installation

First, update setuptools:

pip install --upgrade setuptools

Run the following command in the parent pyalfe directory:

pip install -e .

Option 2: Build and install

First, update the build tool:

pip install --upgrade build

Run the following commands in the parent pyalfe directory to build the wheel file and install pyalfe:

python -m build
pip install dist/pyalfe-0.1.0-py3-none-any.whl

Download models

To download deep learning models, run

pyalfe download models

Pyradiomics support

To install pyalfe with pyradiomics support, run

pip install -e '.[radiomics]'

for development installation or

pip install 'dist/pyalfe-0.1.0-py3-none-any.whl[radiomics]'

when performing a build and install.

Usage

Configuration

To configure the PyALFE pipeline, you should run:

pyalfe configure

which prompts you to enter the following required configurations:

Input directory

Enter input image directory: /path/to/my_mri_data

The input directory (input_dir) contains the images that will be processed by PyALFE and should be organized by accessions (or session IDs). Inside the directory for each accession, there should be a directory for each available modality. Here is an example that follows the ALFE default structure:

my_mri_data
│
│───12345
│   │
│   │───T1
│   │   └── T1.nii.gz
│   │───T1Post
│   │   └── T1Post.nii.gz
│   │───FLAIR
│   │   └── FLAIR.nii.gz
│   │───ADC
│   │   └── ADC.nii.gz
│   │───T2
│   │   └── T2.nii.gz
│   └───CBF
│       └── CBF.nii.gz
└───12356
.   │
.   │───T1
.   │   └── T1.nii.gz
    │───T1Post
    │   └── T1Post.nii.gz
    │───FLAIR
    │   └── FLAIR.nii.gz
    │───ADC
    │   └── ADC.nii.gz
    └───T2
        └── T2.nii.gz

To use this directory, the user should provide /path/to/my_mri_data as the input directory. This config value can be overridden when calling pyalfe run via the -id or --input-dir option.
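
For example, using accession 12345 from the structure above, the saved input directory can be overridden for a single run:

pyalfe run -id /path/to/my_mri_data 12345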

PyALFE also supports BIDS directories. Here is an example of an input directory organized in BIDS format:

my_mri_data
│
│───sub-01
│   │───anat
│   │   │───sub-01_T1w.nii.gz
│   │   │───sub-01_ce-gadolinium_T1w.nii.gz
│   │   │───sub-01_T2w.nii.gz
│   │   └───sub-01_FLAIR.nii.gz
│   │───dwi
│   │    │───sub-01_dwi.nii.gz
│   │    └───sub-01_md.nii.gz
│   │───swi
│   │    └───sub-01_swi.nii.gz
│   └───perf
│       └───sub-01_cbf.nii.gz
│
└───sub-02
.   │───anat
.   │   │───sub-02_T1w.nii.gz
.   │   │───sub-02_ce-gadolinium_T1w.nii.gz
    │   │───sub-02_T2w.nii.gz
    │   └───sub-02_FLAIR.nii.gz
    │───dwi
    │    │───sub-02_dwi.nii.gz
    │    └───sub-02_md.nii.gz
    │───swi
    │    └───sub-02_swi.nii.gz
    └───perf
        └───sub-02_cbf.nii.gz

Output directory

Enter output image directory: /path/to/output_dir

The output image directory (output_dir) is where PyALFE writes all its output. It can be any valid filesystem path that the user has write access to. This config value can be overridden when calling pyalfe run via the -od or --output-dir option.
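
For example, to redirect the output of a single run for accession 12345:

pyalfe run -od /path/to/output_dir 12345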

Modalities

Enter modalities separated by comma [T1,T1Post,FLAIR,T2,ADC]: T1,T1Post,ADC

All the modalities that should be processed by ALFE, separated by commas. To use the default value of T1,T1Post,T2,FLAIR,ADC, simply press enter. This config value can be overridden when calling pyalfe run via the -m or --modalities option.
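
For example, to process only T1, T1Post, and ADC (as entered in the prompt above) for accession 12345:

pyalfe run -m T1,T1Post,ADC 12345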

Target modalities

Enter target modalities separated by comma [T1Post,FLAIR]:

The target modalities are used to define the abnormalities, which are then used to extract features. Currently, only T1Post, FLAIR, or both (the default) can be target modalities. This config value can be overridden when calling pyalfe run via the -t or --targets option.
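
For example, to restrict the target modalities to T1Post for accession 12345:

pyalfe run -t T1Post 12345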

Dominant Tissue

Enter the dominant tissue for the lesions (white_matter, gray_matter, auto) [white_matter]:

The dominant tissue where the tumor or lesion is expected to be located. This information is used in relative signal feature calculations. If you choose auto, pyalfe automatically detects the dominant tissue after segmentation. This config value can be overridden when calling pyalfe run via the -dt or --dominant_tissue option.
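
For example, to let pyalfe detect the dominant tissue automatically for accession 12345:

pyalfe run -dt auto 12345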

Image processor

image processor to use (c3d, nilearn) [c3d]:

Currently, pyalfe can be configured to use either Convert3D (a.k.a. c3d) or Nilearn for image processing tasks. The default is Convert3D. To use c3d, you have to download it using the download command. To use Nilearn, you do not need to run any extra command since it is installed together with pyalfe. This config value can be overridden when calling pyalfe run via the -ip or --image_processing option.
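
For example, assuming the download command accepts c3d as a target in the same way it accepts models, downloading c3d and switching to Nilearn for a single run of accession 12345 might look like:

pyalfe download c3d
pyalfe run -ip nilearn 12345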

Image Registration

image registration to use (greedy, ants) [greedy]:

Currently, pyalfe can be configured to use either greedy or ants for image registration tasks. The default is greedy. To use greedy, you have to download it using the download command. To use ants, install pyalfe with ants support: pip install 'pyalfe[ants]'. This config value can be overridden when calling pyalfe run via the -ir or --image-registration option.
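
For example, after installing ants support as described above, the registration tool can be switched for a single run of accession 12345:

pip install 'pyalfe[ants]'
pyalfe run -ir ants 12345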

Directory Data Structure

data directory structure (press enter for default) (alfe, bids) [alfe]:

The directory structure that PyALFE expects in the input directory and will follow when creating the output. See the Input directory section for information on the ALFE and BIDS structures. This config value can be overridden when calling pyalfe run via the -dds or --data-dir-structure option.
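
For example, assuming the subject label sub-01 from the BIDS example above is used as the accession:

pyalfe run -dds bids sub-01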

Running the pipeline

To run PyALFE for an accession

pyalfe run ACCESSION

If you chose to save the configuration file in a non-standard location, you can run

pyalfe run -c path/to/config.ini ACCESSION
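
For reference, pyalfe configure writes a plain INI file. The sketch below assumes a single [options] section and key names that mirror the prompts above; the exact section and key names in your generated file may differ:

; hypothetical layout; check the file generated by pyalfe configure
[options]
input_dir = /path/to/my_mri_data
output_dir = /path/to/output_dir
modalities = T1,T1Post,FLAIR,T2,ADC
targets = T1Post,FLAIR
dominant_tissue = white_matter
image_processor = c3d
image_registration = greedy
data_dir_structure = alfe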

In general, all the config options can be overridden by command line options. To see a list of command line options, run:

pyalfe run --help

Contributing

Pull requests are welcome. For major changes, please open an issue first to discuss what you would like to change.

Please make sure to update tests as appropriate.

License

BSD 3-Clause
