AugLab

This repository investigates the influence of different data augmentation strategies on MRI training performance.
What is available?
This repository contains:
- A nnUNet trainer with extensive data augmentations
- A basic Monai segmentation script incorporating data augmentations
- A script generating augmentations from input images and segmentations
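The augmentation script operates on paired images and segmentations, so every spatial transform must be applied identically to both. As a rough illustration of that idea only (plain NumPy, not the repository's actual code; `random_flip` and its arguments are hypothetical):

```python
import numpy as np

def random_flip(image, seg, axis=0, p=0.5, rng=None):
    """Flip image and segmentation together along `axis` with probability `p`."""
    rng = rng or np.random.default_rng()
    if rng.random() < p:
        # Apply the same flip to both arrays so labels stay aligned with voxels.
        image = np.flip(image, axis=axis).copy()
        seg = np.flip(seg, axis=axis).copy()
    return image, seg

# Toy 2D "image" and a segmentation derived from it.
img = np.arange(8).reshape(2, 4).astype(float)
lbl = (img > 3).astype(int)
aug_img, aug_lbl = random_flip(img, lbl, axis=1, p=1.0)
print(aug_img[0])  # → [3. 2. 1. 0.]
```

The key design point (which the real scripts share) is that the random decision is drawn once and reused for both arrays, rather than sampled independently per array.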
How to install?

- Open a bash terminal in the directory where you want to work.
- Create and activate a virtual environment using Python >= 3.10 (highly recommended):
  - venv:

    ```bash
    python3 -m venv venv
    source venv/bin/activate
    ```

  - conda:

    ```bash
    conda create -n myenv python=3.10
    conda activate myenv
    ```

- Clone this repository:

  ```bash
  git clone git@github.com:neuropoly/AugLab.git
  cd AugLab
  ```

- Install AugLab using one of the following commands. Note: if you pull a new version from GitHub, make sure to rerun this command with the `--upgrade` flag.
  - nnunetv2-only usage:

    ```bash
    python3 -m pip install -e .[nnunetv2]
    ```

  - Full usage (with Monai and other dependencies):

    ```bash
    python3 -m pip install -e .[all]
    ```

- Install PyTorch following the instructions on their website. Be sure to add the `--upgrade` flag to your installation command to replace any existing PyTorch installation. Example:

  ```bash
  python3 -m pip install torch==2.5.1 torchvision==0.20.1 torchaudio==2.5.1 --index-url https://download.pytorch.org/whl/cu118 --upgrade
  ```
Run nnUNet training with the AugLab trainer

To use the AugLab trainer with nnUNet, first add the trainer to your nnUNet installation by running:

```bash
auglab_add_nnunettrainer --trainer nnUNetTrainerDAExt
```

Then run nnUNet training as usual, specifying the AugLab trainer, for example:

```bash
nnUNetv2_train 100 3d_fullres 0 -tr nnUNetTrainerDAExtGPU -p nnUNetPlans
```

You can also specify your data augmentation parameters by providing a JSON file via the environment variable `AUGLAB_PARAMS_GPU_JSON`:

Note: By default, `auglab/configs/transform_params_gpu.json` is used if no file is specified.

```bash
AUGLAB_PARAMS_GPU_JSON=/path/to/your/params.json nnUNetv2_train 100 3d_fullres 0 -tr nnUNetTrainerDAExtGPU -p nnUNetPlans
```

⚠️ Warning: To avoid any path issues, please specify an absolute path to your JSON file.
How to use my data?

Scripts developed in this repository use JSON files to specify image and segmentation paths: see this example.
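The linked example defines the actual schema; as a purely hypothetical sketch (field names assumed for illustration, not taken from the repository), such a file pairs each image with its segmentation:

```json
{
  "training": [
    {
      "image": "/absolute/path/to/sub-01_T2w.nii.gz",
      "segmentation": "/absolute/path/to/sub-01_T2w_seg.nii.gz"
    }
  ]
}
```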
How do I specify my parameters?

To track the parameters used during data augmentation, JSON files are also used: see this example.
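Again, the linked example is authoritative; a hypothetical parameter file could map each transform to its settings (keys and values below are assumed for illustration, loosely following MONAI transform names):

```json
{
  "RandFlipd": { "prob": 0.5, "spatial_axis": [0, 1, 2] },
  "RandGaussianNoised": { "prob": 0.1, "std": 0.01 }
}
```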
File details

Details for the file auglab-20260109.tar.gz.

File metadata

- Download URL: auglab-20260109.tar.gz
- Size: 62.7 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.10.19

File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | e9601ff977de852a2463dffeec6dbef354f70f9dfadbfaf1c40f8b393489080f |
| MD5 | 2844aa99b55a9c229370e48e982a27b5 |
| BLAKE2b-256 | 6b67b2b2980c9b9dfbbed76229da58abf709e72c305e9794844066ee735f5ff9 |
File details

Details for the file auglab-20260109-py3-none-any.whl.

File metadata

- Download URL: auglab-20260109-py3-none-any.whl
- Size: 69.7 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.10.19

File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | 0c35733eca7d010530957e6cee3209625c2fdbdd38e0584695f1e0cec225ea9c |
| MD5 | 9f07fa154ba8f3e07b848f550342cc6d |
| BLAKE2b-256 | cac313cd9e80e1891527a09603e9ca9a9d2baa65ae9e9981bd65dee55afdc514 |