
Dafne - Deep Anatomical Federated Network

Project description


Dafne

Dafne (Deep Anatomical Federated Network) is a program for the segmentation of medical images. It relies on a server to provide deep-learning models that aid the segmentation, and it uses incremental learning to improve model performance over time. See https://www.dafne.network/ for documentation and user information.

Windows binary installation

Under Windows, first install the Visual C++ Redistributable package: https://aka.ms/vs/16/release/vc_redist.x64.exe. Then run the provided installer.

Mac binary installation

Install the Dafne App from the downloaded .dmg file as usual. Make sure to download the archive appropriate for your architecture (x86 or arm).

Linux binary installation

The Linux distribution is a self-contained executable file. Simply download it, make it executable, and run it.

pip installation

Dafne can also be installed with pip:

pip install dafne

Citing

If you are writing a scientific paper, and you used Dafne for your data evaluation, please cite the following paper:

Santini F, Wasserthal J, Agosti A, et al. Deep Anatomical Federated Network (Dafne): an open client/server framework for the continuous collaborative improvement of deep-learning-based medical image segmentation. 2023. doi: 10.48550/arXiv.2302.06352.

Notes for developers

dafne

Run: python dafne.py <path_to_dicom_img>

Notes for the DL models

Apply functions

The input of the apply function is:

dict({
    'image': np.array (2D image)
    'resolution': sequence with two elements (image resolution in mm)
    'split_laterality': True/False (indicates whether the ROIs should be split in L/R if applicable)
    'classification': str - The classification tag of the image (optional, to identify model variants)
})
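
As an illustrative example, such an input dictionary could be built as follows; the image, resolution, and classification tag are synthetic values, not taken from the Dafne codebase:

import numpy as np

# Synthetic 2D image and metadata, purely for illustration.
input_dict = {
    'image': np.zeros((256, 256), dtype=np.float32),
    'resolution': (1.0, 1.0),        # in-plane resolution in mm
    'split_laterality': True,        # split ROIs into L/R where applicable
    'classification': 'Thigh',       # optional tag identifying the model variant
}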

The output of the classifier is a string. The output of the segmenters is:

dict({
    roi_name_1: np.array (2D mask),
    roi_name_2: ...
})
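
For illustration, here is a minimal sketch of how a segmenter apply function could satisfy this contract. The model object, its predict method, and the 0.5 threshold are assumptions for the sketch, not part of the actual Dafne model interface:

import numpy as np

def apply_segmenter(model, input_dict):
    # 'image' is a 2D np.array, 'resolution' a two-element sequence in mm.
    image = input_dict['image']
    resolution = input_dict['resolution']

    # Hypothetical inference call returning one probability map per ROI.
    probability_maps = model.predict(image, resolution)

    # Threshold each probability map to produce the required 2D binary masks.
    return {
        roi_name: (prob_map > 0.5).astype(np.uint8)
        for roi_name, prob_map in probability_maps.items()
    }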

Incremental learn functions

The inputs of the incremental learn functions are:

training data: dict({
    'resolution': sequence (see above)
    'classification': str (see above)
    'image_list': list([
        - np.array (2D image)
        - np.array (2D image)
        - ...
    ])
})

training outputs: list([
    - dict({
        roi_name_1: np.array (2D mask),
        roi_name_2: ...
    })
    - dict({...})
    - ...
])

Every entry in the training outputs list corresponds to an entry in the image_list inside the training data. So len(training_data['image_list']) == len(training_outputs).
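
As a sketch of how these two structures line up, an incremental learn function could iterate over them in parallel; fit_single is a placeholder for whatever training step the actual model implements, not a real Dafne API call:

def incremental_learn(model, training_data, training_outputs):
    # Each mask dict in training_outputs corresponds to the image at the
    # same position in training_data['image_list'].
    assert len(training_data['image_list']) == len(training_outputs)

    for image, mask_dict in zip(training_data['image_list'], training_outputs):
        for roi_name, mask in mask_dict.items():
            # Placeholder fine-tuning step; the real update is model-specific.
            model.fit_single(image, mask, roi=roi_name,
                             resolution=training_data['resolution'])
    return model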

Acknowledgments

Input/Output is based on DOSMA - GPLv3 license

This software includes the Segment Anything Model (SAM) - Apache 2.0 license

Other packages required for this project are listed in requirements.txt

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

dafne-1.8a3.tar.gz (602.0 kB)


Built Distribution

dafne-1.8a3-py3-none-any.whl (653.1 kB)


File details

Details for the file dafne-1.8a3.tar.gz.

File metadata

  • Download URL: dafne-1.8a3.tar.gz
  • Upload date:
  • Size: 602.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.10.4

File hashes

Hashes for dafne-1.8a3.tar.gz:

  • SHA256: 97fca6225064b5c3b82caab51b899b79208b44399b8931f52998c01e25c89670
  • MD5: a6e3142c8bf0dffcf8aab8ead925624c
  • BLAKE2b-256: f68abaa9740907688690a38c6dfbebb54a88b0894c68c0e4719c94c8fc75e2b9
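
To check a downloaded archive against these values, a short sketch using Python's standard hashlib module is:

import hashlib

# Compute the SHA256 of the downloaded source distribution and compare it
# with the digest listed above (adjust the file path as needed).
with open('dafne-1.8a3.tar.gz', 'rb') as f:
    digest = hashlib.sha256(f.read()).hexdigest()

expected = '97fca6225064b5c3b82caab51b899b79208b44399b8931f52998c01e25c89670'
print('OK' if digest == expected else 'hash mismatch')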


File details

Details for the file dafne-1.8a3-py3-none-any.whl.

File metadata

  • Download URL: dafne-1.8a3-py3-none-any.whl
  • Upload date:
  • Size: 653.1 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.10.4

File hashes

Hashes for dafne-1.8a3-py3-none-any.whl:

  • SHA256: f5f41a3de60c4b910611083893cc0f3f99173d13857a4ec9066c0f8844ebd0e7
  • MD5: 9aec73ca344e46252fee0cb5025984d3
  • BLAKE2b-256: 24667da73b006ff2e4e1de2972051e971c9719ee8f922f7c49a02efd316729cc

