Deep learning classification with clinica
Clinica Deep Learning (clinicadl)
for Alzheimer's Disease
About the project
This repository hosts the source code for reproducible experiments on the automatic classification of Alzheimer's disease (AD) from anatomical MRI data. It allows training convolutional neural network (CNN) models. The journal version of the paper describing this work is available here.
Automatic classification of AD using a classical machine learning approach can be performed using the software available here: https://github.com/aramis-lab/AD-ML.
Disclaimer: this software is under ongoing development. Some features may change between commits. A stable version is planned for release soon. Release v.0.0.1 corresponds to the date of submission of the publication; in the meantime, important changes are being made to facilitate the use of the package.
If you find a problem while using it, or if you want to give us feedback, please open an issue.
Getting Started
Full instructions for installation and additional information can be found in the user documentation.
ClinicaDL currently supports macOS and Linux.
We recommend using conda to install ClinicaDL, as it guarantees the right management of libraries depending on common packages:
conda create --name ClinicaDL python=3.6 pytorch torchvision -c pytorch
conda activate ClinicaDL
git clone git@github.com:aramis-lab/AD-DL.git
cd AD-DL
pip install -r requirements.txt
Once done, install the clinicadl package in development mode in the active conda environment:
cd clinicadl
pip install -e .
Overview
How to use clinicadl?
clinicadl is a command-line utility.
Six kinds of tasks can be performed from the command line:
- Process TSV files. `tsvtool` includes many functions to get labels from BIDS, perform k-fold or single splits, produce demographic analyses of extracted labels, and reproduce the restrictions made on AIBL and OASIS in the original paper.
- Generate a synthetic dataset. The `generate` task is useful to obtain the synthetic datasets frequently used in functional tests.
- T1-weighted image preprocessing. The `preprocessing` task processes a dataset of T1 images stored in BIDS format and prepares it for tensor extraction (see the paper for details on the preprocessing). Output is stored using the CAPS hierarchy.
- T1 MRI tensor extraction. The `extract` task creates files in PyTorch format (.pt) with different options: the complete MRI, 2D slices and/or 3D patches. These files are also stored in the CAPS hierarchy.
- Train neural networks. The `train` task performs the training of CNN models using different kinds of inputs, e.g., a full MRI (3D-image), patches from an MRI (3D-patch), specific regions of an MRI (ROI-based) or slices extracted from the MRI (2D-slices). Parameters used during training are configurable. This task also allows training autoencoders.
- MRI classification. The `classify` task uses previously trained models to perform inference on a single MRI or a set of MRIs.
For detailed instructions and options for each task, type `clinicadl <task> -h`.
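To illustrate what the `extract` task produces, here is a minimal sketch, assuming a numpy stand-in for a preprocessed T1w volume (this is not clinicadl's implementation; the shapes and patch size are hypothetical):

```python
import numpy as np

# Stand-in for a preprocessed T1w MRI volume (hypothetical shape)
volume = np.zeros((40, 40, 40))

# 2D slices: one slice per index along the first axis
slices = [volume[i, :, :] for i in range(volume.shape[0])]

# 3D patches: non-overlapping cubes of side `patch_size`
patch_size = 20
patches = [
    volume[x:x + patch_size, y:y + patch_size, z:z + patch_size]
    for x in range(0, volume.shape[0], patch_size)
    for y in range(0, volume.shape[1], patch_size)
    for z in range(0, volume.shape[2], patch_size)
]
```

In clinicadl, each slice, patch or full image would then be saved as a PyTorch tensor (.pt file) inside the CAPS hierarchy.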
Testing
Be sure to have the pytest library installed in order to run the test suite. This test
suite includes unit tests to be launched from the command line.
Unit testing (WIP)
The CLI (command-line interface) part is tested using pytest. We plan to
provide unit tests for the other tasks in the future. To run the tests,
use a command like this one:
pytest clinicadl/tests/test_cli.py
Functional testing
Training tasks are tested using synthetic data created from MRIs extracted from the OASIS dataset. To run them, go to the test folder and type the following command in the terminal:
pytest ./test_train_cnn.py
Please be sure to create the right dataset beforehand.
Model prediction tests
For sanity checks, trivial datasets can be generated to train or test/validate the predictive models.
The following command generates two kinds of synthetic datasets: fully separable (trivial) data or intractable data (MRIs with random noise added):
clinicadl generate {random,trivial} caps_directory tsv_path output_directory
--n_subjects N_SUBJECTS
The intractable dataset is made of noisy versions of the first image of the TSV file given at tsv_path, associated with random labels.
The trivial dataset includes two labels:
- AD, corresponding to images with lower intensities in the left half of the brain,
- CN, corresponding to images with lower intensities in the right half of the brain.
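The two kinds of synthetic images described above can be sketched with numpy as follows (a toy illustration, not clinicadl's code; the image shape, noise scale and dimming factor are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(0)
base = np.ones((4, 8, 8))  # stand-in for the first image of the TSV file

# "random" dataset: noisy copies of the base image with random labels
noisy = base + rng.normal(scale=0.1, size=base.shape)
label = rng.choice(["AD", "CN"])

# "trivial" dataset: dim one half of the image depending on the label
half = base.shape[-1] // 2
ad_image = base.copy()
ad_image[..., :half] *= 0.5   # AD: left half with lower intensities
cn_image = base.copy()
cn_image[..., half:] *= 0.5   # CN: right half with lower intensities
```

The trivial images are fully separable by construction, which makes them useful for checking that a model can learn anything at all.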
Pretrained models
Some of the pretrained CNN models can be obtained here: https://zenodo.org/record/3491003
These models were obtained during the experiments for the publication. Updated versions of the models will be published soon.
Bibliography
All the papers described in the State of the art section of the manuscript can be found at this URL: https://www.zotero.org/groups/2337160/ad-dl.
Related Repositories
File details
Details for the file clinicadl-0.0.2b2.tar.gz.
File metadata
- Download URL: clinicadl-0.0.2b2.tar.gz
- Size: 75.6 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/1.13.0 pkginfo/1.5.0.1 requests/2.23.0 setuptools/46.1.3.post20200325 requests-toolbelt/0.9.1 tqdm/4.45.0 CPython/3.6.7
File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | 2d41dee4f298bbdf0ddc9620a0c51d1faf5ac2bc6e351111d0ece54c54e33c4b |
| MD5 | dbd995e940d3274793886d9d0a041ecf |
| BLAKE2b-256 | 13c3f0cb988546a871330b7c363552d6104326af4574e8c35645bdf45b1c5eda |
File details
Details for the file clinicadl-0.0.2b2-py2.py3-none-any.whl.
File metadata
- Download URL: clinicadl-0.0.2b2-py2.py3-none-any.whl
- Size: 103.8 kB
- Tags: Python 2, Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/1.13.0 pkginfo/1.5.0.1 requests/2.23.0 setuptools/46.1.3.post20200325 requests-toolbelt/0.9.1 tqdm/4.45.0 CPython/3.6.7
File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | 42d9c3474b0b807f0f9c214a6e5e8c7cdd4228b1b207b61cc4e784da6154711b |
| MD5 | 3b1f962d4a78da2e9e6a478454876273 |
| BLAKE2b-256 | 656078ff4b3b981e2715aea5921cc438ee6fb1ea860578d91b84e4ebafd4c9c4 |
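The SHA256 digests listed above can be checked after downloading a distribution file; a minimal standard-library sketch (the file path is an assumption, substitute the file you downloaded):

```python
import hashlib


def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
    """Compute the SHA256 hex digest of a file, reading it in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()


# Example (hypothetical local path):
# expected = "2d41dee4f298bbdf0ddc9620a0c51d1faf5ac2bc6e351111d0ece54c54e33c4b"
# assert sha256_of("clinicadl-0.0.2b2.tar.gz") == expected
```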