
EfficientV2-UNet

This package is a U-Net implementation of the EfficientNetV2, using TensorFlow.

EfficientNetV2 improves training speed and parameter efficiency. This implementation uses the ImageNet-pretrained weights as the starting point for training new models.

It is intended for segmentation of histological RGB images that are not saved in a pyramidal whole-slide image (WSI) file format.

The output segmentation is foreground / background. Multi-class segmentation is not (yet) possible.

It works on TIF images (and probably also PNG).

Installation

  1. Create a Python environment (e.g. with conda; Python 3.9 and 3.10 are known to work) in a CLI:

    conda create --name myenv python=3.9

  2. Activate environment:

    conda activate myenv

  3. GPU support

    Non-GPU installations have not been tested.

    a. GPU support for Windows (example with conda):

    conda install -c conda-forge cudatoolkit=11.2 cudnn=8.1.0

    b. GPU support for Linux -- not tested:

    python3 -m pip install tensorflow[and-cuda]

    c. Apple Silicon support (requires Xcode command-line tools):

    xcode-select --install

    conda install -c apple tensorflow-deps --force-reinstall

  4. Install this library

    • Open a CLI and activate your environment (see above).
    • TensorFlow will be installed automatically on Windows and macOS platforms.

    pip install efficientv2-unet

  5. Verify the GPU support:

    python -c "import tensorflow as tf; print(tf.config.list_physical_devices('GPU'))"

    or

    python -c "import tensorflow as tf; print(tf.test.is_gpu_available())"

Data preparation

Masks should have background values of 0 and foreground values of 1.

At least 3 image/mask TIF pairs are required to train a model; images and masks should be located in separate folders, with matching file names.

Folder Structure:

├── images
│   ├── image1.tif
│   ├── image2.tif
│   ├── image3.tif
│   └── ...
└── masks
    ├── image1.tif
    ├── image2.tif
    ├── image3.tif
    └── ...
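
Before training, it can help to check that every image has a matching mask and that the masks really are binary. The snippet below is a minimal sketch and not part of this package; it assumes the TIF files can be read with the tifffile library, and the folder paths are placeholders to adapt to your data.

# Sanity-check sketch: matching image/mask pairs and binary (0/1) mask values.
from pathlib import Path

import numpy as np
import tifffile

images_dir = Path("path/to/images")  # placeholder paths, adjust to your data
masks_dir = Path("path/to/masks")

for image_path in sorted(images_dir.glob("*.tif")):
    mask_path = masks_dir / image_path.name
    assert mask_path.exists(), f"missing mask for {image_path.name}"
    mask_values = set(np.unique(tifffile.imread(mask_path)).tolist())
    assert mask_values <= {0, 1}, f"{mask_path.name} is not a 0/1 mask: {mask_values}"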

Training a model will split the data into train, validation and test images (by default 70%, 15% and 15%, respectively), and the images will be moved to corresponding sub-folders.

Training is not performed on the full images but on non-overlapping tiles, which are saved into corresponding sub-folders.
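
For illustration only, the sketch below shows one way to cut an image into non-overlapping tiles with NumPy. The tile size (256) and the border handling are assumptions for the example, not the package's actual implementation.

import numpy as np

def tile_image(image: np.ndarray, tile_size: int = 256) -> list[np.ndarray]:
    """Cut an (H, W, C) image into non-overlapping tile_size x tile_size tiles.

    Edges that do not fill a complete tile are dropped in this sketch; the
    package may handle borders differently.
    """
    h, w = image.shape[:2]
    tiles = []
    for y in range(0, h - tile_size + 1, tile_size):
        for x in range(0, w - tile_size + 1, tile_size):
            tiles.append(image[y:y + tile_size, x:x + tile_size])
    return tiles

# Example: a 1000x1200 RGB image yields 3 x 4 = 12 tiles of 256x256.
dummy = np.zeros((1000, 1200, 3), dtype=np.uint8)
print(len(tile_image(dummy)))  # 12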

Usage

Command-line:

ev2unet --help

# train example:
ev2unet --train --images path/to/images --masks path/to/masks --basedir . --name myUNetName --basemodel b2 --epochs 50

# predict example:
ev2unet --predict --dir path/to/images --model ./models/myUNetName/myUNetName.h5 --resolution 1 --threshold 0.5
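
If you want to drive the CLI from Python, for example to predict several folders in a row, a minimal sketch using subprocess could look like the following; the folder names and model path are placeholders.

import subprocess

# Placeholder folders to segment one after the other.
folders = ["path/to/images_batch1", "path/to/images_batch2"]

for folder in folders:
    subprocess.run(
        [
            "ev2unet", "--predict",
            "--dir", folder,
            "--model", "./models/myUNetName/myUNetName.h5",
            "--resolution", "1",
            "--threshold", "0.5",
        ],
        check=True,  # raise if the CLI exits with an error
    )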

Jupyter notebooks

Example notebooks are also available in this repository.

QuPath extension

Get the QuPath extension!

Download files

Download the file for your platform.

Source Distribution

efficientv2_unet-0.0.2rc0.tar.gz (8.9 MB)

Uploaded Source

Built Distribution

efficientv2_unet-0.0.2rc0-py3-none-any.whl (32.5 kB)

Uploaded Python 3

File details

Details for the file efficientv2_unet-0.0.2rc0.tar.gz.

File metadata

  • Download URL: efficientv2_unet-0.0.2rc0.tar.gz
  • Size: 8.9 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/5.1.0 CPython/3.12.5

File hashes

Hashes for efficientv2_unet-0.0.2rc0.tar.gz:

  • SHA256: e52dac57fa0b4b50819e11f98197f162472ebb31a0a1293a854440f0f0b51768
  • MD5: ccafa8964c39369a84f068074bcf86bd
  • BLAKE2b-256: d2a697774fcea1ad2d3e242629798dd70821a3b872bb1e5ecf28c2554cd442ff
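
To verify a downloaded archive against the SHA256 digest listed above, a short Python sketch could look like this; the file path is a placeholder for wherever you saved the download.

import hashlib
from pathlib import Path

# Expected digest copied from the list above.
EXPECTED_SHA256 = "e52dac57fa0b4b50819e11f98197f162472ebb31a0a1293a854440f0f0b51768"

path = Path("efficientv2_unet-0.0.2rc0.tar.gz")  # adjust to your download location
digest = hashlib.sha256(path.read_bytes()).hexdigest()
print("OK" if digest == EXPECTED_SHA256 else f"MISMATCH: {digest}")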


File details

Details for the file efficientv2_unet-0.0.2rc0-py3-none-any.whl.

File hashes

Hashes for efficientv2_unet-0.0.2rc0-py3-none-any.whl:

  • SHA256: cda85159ad79ee594bdafc0df22e696ba3080ef240d2d43396c5b54a14d25800
  • MD5: c053958a151132ec99d1dc6059d95dd0
  • BLAKE2b-256: 4fd7bf59e79c15479c4ee3fcab4b397e703d4d542027af6aeaf0c6a3a400615a

