Microstructure Diffusion Toolbox
The Microstructure Diffusion Toolbox (MDT), formerly known as the Maastricht Diffusion Toolbox, is a framework and library for parallelized (GPU and multi-core CPU) diffusion Magnetic Resonance Imaging (MRI) modeling. MDT's object-oriented, modular design allows arbitrary user specification and combination of biophysical MRI compartment models; diffusion-, T1-, T2- and T2*-based microstructure models; likelihood functions; and optimization algorithms. MDT was designed with compatibility in mind and adheres to the input, output and variable naming conventions used by related software tools. Many diffusion and relaxometry microstructure models are included, and new models can be added dynamically. MDT can also be extended to other modalities and to other parametric models estimated from data volumes that vary along controlled parameters (such as b-values, diffusion times, TE, TM, flip angle, etc.). The parallelized, accelerated computations make model fitting tens to hundreds of times faster, even on standard GPU (and/or CPU) hardware, making MDT well suited for large group or population studies.
- Human Connectome Project (HCP) compatible pipelines (MGH and WuMinn)
- Comes with CHARMED, NODDI, BinghamNODDI, NODDIDA, NODDI-DTI, ActiveAx, AxCaliber, Ball&Sticks, Ball&Rackets, Kurtosis, Tensor and relaxometry (T1, T2) models.
- Gaussian, Offset-Gaussian and Rician noise models
- Powell, Levenberg-Marquardt and Nelder-Mead Simplex optimization methods
- Multiple MCMC sampling methods
- GUI, command line and python interfaces
- Dynamically add your own models
- Free Open Source Software: LGPL v3 license
- Python and OpenCL based
- Supports hyperpriors on parameters
- Supports gradient deviations per voxel and per voxel per volume
- Supports volume weighted objective function
- Parallelization over voxels and over volumes
- Runs on Windows, Mac and Linux operating systems
- Runs on Intel, Nvidia and AMD GPUs and CPUs
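To illustrate the Gaussian, Offset-Gaussian and Rician noise models listed above, the standard per-measurement log-likelihoods can be sketched with the Python standard library. This is a generic textbook sketch, not MDT's internal implementation; the function names are illustrative only.

```python
import math

def gaussian_ll(m, s, sigma):
    """Gaussian log-likelihood of measurement m given model prediction s."""
    return -0.5 * math.log(2 * math.pi * sigma**2) - (m - s)**2 / (2 * sigma**2)

def offset_gaussian_ll(m, s, sigma):
    """Offset-Gaussian: the prediction is raised by the expected noise floor."""
    s_off = math.sqrt(s**2 + sigma**2)
    return -0.5 * math.log(2 * math.pi * sigma**2) - (m - s_off)**2 / (2 * sigma**2)

def log_i0(x):
    """log of the modified Bessel function I0: truncated series for small x,
    leading-order asymptotic expansion for large x."""
    if x < 20:
        return math.log(sum((x * x / 4)**k / math.factorial(k)**2 for k in range(40)))
    return x - 0.5 * math.log(2 * math.pi * x)

def rician_ll(m, s, sigma):
    """Rician log-likelihood, appropriate for magnitude MR data at low SNR."""
    return (math.log(m / sigma**2)
            - (m * m + s * s) / (2 * sigma**2)
            + log_i0(m * s / sigma**2))
```

At high SNR the Rician likelihood converges to the Gaussian one, which is why the simpler Gaussian model is often an acceptable choice for well-averaged data.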
MDT comes pre-installed with Human Connectome Project (HCP) compatible pipelines for the MGH and the WuMinn 3T studies. After installing MDT, go to the folder containing your (pre-processed) HCP data (MGH or WuMinn) and execute:
$ mdt-batch-fit . 'NODDI (Cascade)'
MDT will then autodetect which study is in use and fit the selected model to all subjects.
Quick installation guide
The basic requirements for MDT are:
- Python 3.x
- OpenCL 1.2 (or higher) support in GPU driver or CPU runtime
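To check whether a usable OpenCL platform is visible from Python, you can query pyopencl, which is one of MDT's dependencies. The helper below is a hypothetical convenience sketch (not part of MDT); it returns an empty list when pyopencl or an OpenCL runtime is missing instead of raising.

```python
def available_opencl_devices():
    """Return (platform name, device name) pairs, or [] if OpenCL is unavailable."""
    try:
        import pyopencl as cl  # MDT dependency; may not be installed on this machine
    except ImportError:
        return []
    devices = []
    try:
        for platform in cl.get_platforms():
            for device in platform.get_devices():
                devices.append((platform.name, device.name))
    except cl.Error:
        # No ICD loader / no OpenCL runtime available
        return []
    return devices

if __name__ == '__main__':
    for platform_name, device_name in available_opencl_devices():
        print(platform_name, '->', device_name)
```

If this prints no devices, install a vendor OpenCL driver (GPU) or CPU runtime before installing MDT.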
For Ubuntu >= 16 you can use:
- sudo add-apt-repository ppa:robbert-harms/cbclab
- sudo apt-get update
- sudo apt-get install python3-mdt python3-pip
- sudo pip3 install tatsu
For Debian users and Ubuntu < 16 users, install MDT with:
- sudo apt-get install python3 python3-pip python3-pyopencl python3-numpy python3-nibabel python3-pyqt5 python3-matplotlib python3-yaml python3-argcomplete libpng-dev libfreetype6-dev libxft-dev
- sudo pip3 install mdt
Note that the python3-nibabel package may require NeuroDebian to be available on your machine; alternatively, install it with pip3 install nibabel.
The installation on Windows is a little more complex and the following is only a quick reference guide; for complete instructions, please see the full documentation.
For Windows:
- Install Anaconda Python 3.*
- Install MOT using the guide at https://mot.readthedocs.io
- Open an Anaconda shell and type: pip install mdt
For Mac:
- Install Anaconda Python 3.*
- Open a terminal and type: pip install mdt
Please note that Mac support is experimental due to the unstable nature of the OpenCL drivers on macOS: users running MDT with the GPU as the selected device may experience crashes. Running MDT on the CPU does appear to work.
For more information and full installation instructions see https://mdt_toolbox.readthedocs.org
Download the file for your platform.
| Filename | Size | File type | Python version | Upload date |
|---|---|---|---|---|
| mdt-0.15.7-py2.py3-none-any.whl | 17.0 MB | Wheel | py2.py3 | Oct 19, 2018 |
| mdt-0.15.7.tar.gz | 16.9 MB | Source | None | Oct 19, 2018 |