Project description
Deep Learning for Higher Harmonic Generation Microscopy
Deep learning utilities for higher harmonic generation microscopy images.
This project is a deep learning application to classify various pediatric brain tumours from higher harmonic generation microscopy images.
Explore the docs » · Report Bug · Request Feature
About The Project
The project applies deep learning classification to higher harmonic generation (HHG) microscopy images of pediatric brain tumours.
Getting Started
This section includes instructions on setting up the project locally.
Prerequisites
Conda
For package management, it is advised to use conda. The author recommends Miniforge or Mambaforge.
vips
This project depends on dlup (installed automatically), which in turn depends on vips. On Windows, vips must be installed locally: download the latest libvips Windows binary and unzip it somewhere convenient. On Linux/macOS, vips is included with the installation steps below.
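For reference, below is a minimal Windows-only sketch of making the unzipped libvips DLLs discoverable from Python; the path is a placeholder for wherever you unzipped the binary. When using dpat itself, prefer the `dpat.install_windows` call shown under Installation.

```python
# Windows-only sketch: make the unzipped libvips DLLs discoverable before
# anything tries to load vips. The path below is a placeholder, not a real
# install location. When using dpat, prefer dpat.install_windows() instead.
import os

os.add_dll_directory(r"C:\path\to\vips\bin")
```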
OpenSlide
Vips ships with OpenSlide, so there is no need to install OpenSlide separately.
CUDA
To run deep learning on CUDA-enabled accelerators, follow the installation instructions on pytorch.org. Run `nvidia-smi` to check whether CUDA is available.
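Once PyTorch is installed, the same check can be done from Python; a minimal sketch:

```python
# Quick CUDA availability check (requires PyTorch to be installed).
import torch

print(torch.cuda.is_available())           # True if a usable CUDA device is found
if torch.cuda.is_available():
    print(torch.cuda.get_device_name(0))   # name of the first GPU
```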
Installation
Run the following commands from a conda-enabled shell (such as Miniforge Prompt, if Miniforge/Mambaforge is installed).
- Clone this repository and change directories:
  ```sh
  git clone https://github.com/siemdejong/dpat.git dpat && cd dpat
  ```
- Create a new conda environment and activate it:
  ```sh
  conda create -n <env_name>
  conda activate <env_name>
  ```
- Install dependencies from `environment.yml`:
  ```sh
  conda env update -f environment.yml
  ```
- Windows only: if you use this library in scripts, make sure libvips is available (see the vips section under Prerequisites) and tell dpat where to find it:
  ```python
  import dpat

  dpat.install_windows("path/to/vips/bin")
  ```
- If you use this library for deep learning and want CUDA-enabled PyTorch, follow the instructions on pytorch.org. Make sure CUDA is available (see the CUDA section under Prerequisites).
- Install dpat in editable mode:
  ```sh
  pip install -e .
  ```
- Verify the installation (see also the sketch below the list):
  ```sh
  python -c "import dpat"
  ```
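As an additional sanity check, the installed version can be printed with the standard library; this sketch assumes the distribution is installed under the name `dpat`:

```python
# Post-install sanity check: import the package and print the installed version.
from importlib.metadata import version

import dpat  # fails loudly if the installation is broken

print(version("dpat"))
```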
Usage
Converting images
To convert all images in INPUT_DIR and output them as TIFF to OUTPUT_DIR, run

```sh
dpat convert batch -i INPUT_DIR -o OUTPUT_DIR -e tiff
```
Large images are rejected as potential decompression bomb DoS attacks unless they are explicitly trusted; use the `--trust` flag.
To skip images that were already converted to the target extension, use `--skip-existing`.

NOTE: If converting to TIFF, the input images are assumed to contain a reference to the scanning program, which must be one of {200slow, 300slow, 300fast}.
```text
Usage: dpat convert batch [OPTIONS]

Options:
  -i, --input-dir TEXT         Input directory where to find the images to be
                               converted.  [default: .]
  -o, --output-dir TEXT        Output directory where to place converted
                               files.  [default: ./converted]
  -e, --output-ext [tiff|tif]  Extension to convert to.  [required]
  -w, --num-workers INTEGER    Number of workers that convert the images in
                               parallel.  [default: 4]
  -c, --chunks INTEGER         Number of chunks distributed to every worker.
                               [default: 30]
  --trust                      Trust the source of the images.
  --skip-existing              Skip existing output files.
  --help                       Show this message and exit.
```
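The same conversion can also be launched from a Python script; the following is a minimal sketch using only the flags documented above, with placeholder directories:

```python
# Sketch: call the dpat CLI from Python. INPUT_DIR and OUTPUT_DIR are
# placeholders; only flags from the help output above are used.
import subprocess

subprocess.run(
    [
        "dpat", "convert", "batch",
        "-i", "INPUT_DIR",
        "-o", "OUTPUT_DIR",
        "-e", "tiff",
        "--num-workers", "4",
        "--skip-existing",
        "--trust",
    ],
    check=True,  # raise CalledProcessError if the conversion fails
)
```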
Creating splits
To create train-val-test splits that link image paths found in IMAGE_DIR to splits, writing the splits to OUTPUT_DIR, with labels from PATH_TO_LABELS_FILE and dataset name NAME, run

```sh
dpat splits create -i IMAGE_DIR -o OUTPUT_DIR -l PATH_TO_LABELS_FILE -n NAME
```
To filter diagnoses that exactly match certain diseases, use e.g. `-f medulloblastoma -f "pilocytic astrocytoma"`.
To include only filenames that match certain values, use a glob pattern, e.g. `-y *slow.tiff` to only include images ending with `slow.tiff`.
To exclude filenames that match certain values, use a glob pattern with `-x`.
Exclusion is performed on the set selected by inclusion, as illustrated in the sketch below.
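The include/exclude semantics can be pictured with plain glob matching; this is only an illustration with made-up filenames, not dpat's implementation:

```python
# Illustration of the documented semantics: --exclude (-x) is applied to the
# set of files already selected by --include (-y). Filenames are made up.
from fnmatch import fnmatch

files = ["case1_300slow.tiff", "case2_300fast.tiff", "case3_200slow.tiff"]
include, exclude = "*slow.tiff", "case3*"

selected = [f for f in files if fnmatch(f, include)]         # inclusion first
selected = [f for f in selected if not fnmatch(f, exclude)]  # then exclusion
print(selected)  # ['case1_300slow.tiff']
```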
```text
Usage: dpat splits create [OPTIONS]

Options:
  -i, --input-dir TEXT   Input directory where to find the images.  [required]
  -l, --labels TEXT      Path to labels file.  [required]
  -n, --name TEXT        Name of dataset.  [required]
  -o, --output-dir TEXT  Directory where to put the splits.  [default: splits]
  --overwrite            Overwrite folds in output dir, if available.
  -y, --include TEXT     Glob pattern to include files from `input-dir`.
                         [default: *.*]
  -x, --exclude TEXT     Glob pattern to exclude files from `input-dir`,
                         included with `--include`.
  -f, --filter TEXT      Filter a diagnosis. For multiple diagnoses, use
                         `-f 1 -f 2`.
  --help                 Show this message and exit.
```
Logging
When using the package as a library, its logging can be turned off if needed:

```python
import logging

logging.getLogger("dpat").propagate = False
```
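Conversely, if you do want dpat's messages in your application's logs, here is a minimal sketch using only the standard library and the logger name from the snippet above:

```python
import logging

# Route dpat's log records through a basic root handler.
logging.basicConfig(level=logging.INFO)
logging.getLogger("dpat").setLevel(logging.DEBUG)  # make dpat more verbose
```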
Contributing
Contribute using the following steps.
- Fork the Project
- Create your Feature Branch (`git checkout -b feature/AmazingFeature`)
- Commit your Changes (`git commit -m 'Add some AmazingFeature'`)
- Push to the Branch (`git push origin feature/AmazingFeature`)
- Open a Pull Request
License
Distributed under the GNU General Public License v3.0. See LICENSE for more information.
Contact
Siem de Jong - linkedin.com/in/siemdejong - siem.dejong@hotmail.nl
Project Link: https://github.com/siemdejong/dpat
Project details
Download files
Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
Source Distribution
Built Distribution
File details
Details for the file `dpat-2.0.0.tar.gz`.
File metadata
- Download URL: dpat-2.0.0.tar.gz
- Upload date:
- Size: 38.1 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/3.8.0 pkginfo/1.9.6 readme-renderer/37.3 requests/2.28.2 requests-toolbelt/0.10.1 urllib3/1.26.14 tqdm/4.64.1 importlib-metadata/6.0.0 keyring/23.13.1 rfc3986/1.5.0 colorama/0.4.6 CPython/3.9.16
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 2c014bad80130fafb95bc96ec1d57a365672ceb058640fec31358efa9a97a314 |
| MD5 | 05cb0bae0bbbf4a8e93c34c9339e37a5 |
| BLAKE2b-256 | 7af65377652ad4a81f4424b2e97eee188ed5ee6b6fe55506cdc02ec68988406f |
File details
Details for the file `dpat-2.0.0-py3-none-any.whl`.
File metadata
- Download URL: dpat-2.0.0-py3-none-any.whl
- Upload date:
- Size: 43.1 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/3.8.0 pkginfo/1.9.6 readme-renderer/37.3 requests/2.28.2 requests-toolbelt/0.10.1 urllib3/1.26.14 tqdm/4.64.1 importlib-metadata/6.0.0 keyring/23.13.1 rfc3986/1.5.0 colorama/0.4.6 CPython/3.9.16
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 788d7ae238c6bbae15a2a4c6dc8d966750098a19980d579b237f2b46ae950ca9 |
| MD5 | c76bef557299bb1ad9a6e5cb284d8b73 |
| BLAKE2b-256 | 8df064798bfbdf5494fd9570ccc9eb5e55064182ef83b242e1d6ff53b3b8e8f4 |