
Detection and Segmentation Accuracy Measures


Documentation

After installation, the daccuracy command should be available from a command-line console. The usage help is obtained with daccuracy --help (see Usage Help below).

Input Formats

The ground-truth can be specified through a CSV file, a labeled image, or a labeled Numpy array. The detection can be specified through a labeled image or a labeled Numpy array. A labeled image or Numpy array must have the background labeled with zero, with the objects labeled consecutively from 1.
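For instance, a conforming labeled array can be built with NumPy (a hypothetical two-object example; the file name is illustrative):

```python
import numpy as np

# Labeled 2-D array in the expected format: background is 0,
# objects are labeled consecutively from 1.
labels = np.zeros((8, 8), dtype=np.uint16)
labels[1:3, 1:3] = 1  # first object
labels[5:7, 4:7] = 2  # second object

n_objects = int(labels.max())  # 2
assert set(np.unique(labels)) == {0, 1, 2}

# NPY/NPZ formats avoid the 255-object limit of 8-bit images.
np.save("ground_truth.npy", labels)
```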

In CSV format, the ground-truth must be specified as one row per object, where n columns (by default, the first n) contain the object center's row, column, and remaining n-2 coordinates. Note that these coordinates can have floating-point values (as opposed to being restricted to integers). See the usage help below for details.

Example CSV:

1.2, 2.3
3.4, 4.5

This specifies two ground-truth object centers in dimension 2, the first being at row 1.2 and column 2.3. Alternatively, the center coordinates can be passed in an x/y coordinate system. See the usage help below for details.
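Such a file can be produced with Python's csv module (a minimal sketch; the file name is illustrative):

```python
import csv

# Ground-truth centers in the default layout: the first columns are the
# row and column (floating-point values are allowed) of each center.
centers = [(1.2, 2.3), (3.4, 4.5)]
with open("ground_truth.csv", "w", newline="") as out:
    csv.writer(out).writerows(centers)

# Read the file back to check the round trip.
with open("ground_truth.csv", newline="") as src:
    rows = [tuple(map(float, row)) for row in csv.reader(src)]
```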

Accuracy Measures

The following accuracy measures are computed:

  • Number of ground-truth objects

  • Number of detected objects

  • Number of true positives, false positives, and false negatives

  • Precision, recall, and F1 score

  • Free-response Receiver Operating Characteristic (FROC) curve sample, named froc_sample, corresponding to the tuple (false positives, true positive rate)

  • Values for correctness checking of the measures: check_tp_fn_equal_gt (does true_positives + false_negatives equal the number of ground truths?) and check_tp_fp_equal_dn (does true_positives + false_positives equal the number of detections?)
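The relationships among these point-detection measures can be sketched as follows (illustrative counts; this is a sketch of the definitions, not DAccuracy's implementation):

```python
# Hypothetical counts of true positives, false positives, false negatives.
tp, fp, fn = 43, 4, 12

precision = tp / (tp + fp)
recall = tp / (tp + fn)  # also the true positive rate
f1_score = 2.0 * precision * recall / (precision + recall)

# FROC curve sample: (false positives, true positive rate).
froc_sample = (fp, recall)

# Correctness checks: these sums should equal the number of
# ground-truth objects and detected objects, respectively.
check_tp_fn_equal_gt = tp + fn
check_tp_fp_equal_dn = tp + fp
```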

Additionally, if the ground-truth has been passed as an image or a Numpy array, the mean, standard deviation, minimum, and maximum of the following measures are also computed:

  • Ground-truth/detection overlap (as a percentage with respect to the smaller region among ground-truth and detection)

  • Ground-truth/detection Jaccard index

  • Pixel-wise precision, recall, and F1 score
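For the region-based measures, here is a sketch of the overlap and Jaccard computations on a pair of hypothetical binary masks (one ground-truth object vs. one detected object):

```python
import numpy as np

# One hypothetical ground-truth region and one detected region.
ground_truth = np.zeros((6, 6), dtype=bool)
ground_truth[1:4, 1:4] = True  # 9 pixels
detection = np.zeros((6, 6), dtype=bool)
detection[2:5, 2:5] = True  # 9 pixels

intersection = np.logical_and(ground_truth, detection).sum()  # 4 pixels
union = np.logical_or(ground_truth, detection).sum()  # 14 pixels

jaccard = intersection / union
# Overlap as a percentage of the smaller of the two regions.
overlap = 100.0 * intersection / min(ground_truth.sum(), detection.sum())
```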

Output Formats

See usage help below.

Usage Help (daccuracy --help)


usage: daccuracy [-h] --gt ground_truth --dn detection [--shifts Dn_shift [Dn_shift ...]] [-e] [-t TOLERANCE]
                 [-f {csv,nev}] [-o Output file] [-s]

3 modes:
    - one-to-one: one ground-truth (csv, image, or Numpy array) vs. one detection (image or Numpy array)
    - one-to-many: one ground-truth vs. several detections (folder of detections)
    - many-to-many: several ground-truths (folder of ground-truths) vs. corresponding detections (folder of detections)

In many-to-many mode, each detection file must have a counterpart ground-truth file with the same name, but not
necessarily the same extension.

With 8-bit image formats, ground-truth and detection cannot contain more than 255 objects. If they do, they could be
saved using higher-depth formats. However, it is recommended to save them in NPY or NPZ Numpy formats instead.

optional arguments:
  -h, --help            show this help message and exit
  --gt ground_truth     Ground-truth CSV file of centers or labeled image or labeled Numpy array, or ground-truth folder;
                        If CSV, --rAcB (or --xAyB) can be passed additionally to indicate that columns A and B contain
                        the centers' rows and cols, respectively (or x's and y's in x/y mode). Columns must be specified
                        as (possibly sequences of) uppercase letters, as is usual in spreadsheet applications. For
                        ground-truths of dimension "n" higher than 2, the symbol "+" must be used for the remaining
                        "n-2" dimensions. For example, --rAcB+C+D in dimension 4.
  --relabel-gt {seq,full}
                        If present, this option instructs to relabel the ground-truth
                        sequentially.
  --dn detection        Detection labeled image or labeled Numpy array, or detection folder.
  --shifts Dn_shift [Dn_shift ...]
                        Vertical (row), horizontal (col), and higher dimension shifts to apply to detection. Default:
                        all zeroes.
  -e, --exclude-border  If present, this option instructs to discard objects touching image border, both in ground-truth
                        and detection.
  -t TOLERANCE, --tol TOLERANCE, --tolerance TOLERANCE
                        Max ground-truth-to-detection distance to count as a hit (meant to be used when ground-truth is
                        a CSV file of centers). Default: zero.
  -f {csv,nev}, --format {csv,nev}
nev: one "Name = Value" row per measure; csv: one CSV row per ground-truth/detection pair.
                        Default: "nev".
  -o Output file        CSV file to store the computed measures or "-" for console output. Default: console output.
  -s, --show-image      If present, this option instructs to show an image superimposing ground-truth onto detection.
                        It is actually done only for 2-dimensional images.
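Putting the options together, a hypothetical one-to-one invocation could look like this (the command is only assembled here, not run; the file names are illustrative):

```python
# Ground truth as a CSV of centers, detection as a labeled image,
# a 2-pixel matching tolerance, CSV-formatted measures written to a file.
command = [
    "daccuracy",
    "--gt", "ground_truth.csv",
    "--dn", "detection.png",
    "--tolerance", "2",
    "--format", "csv",
    "-o", "measures.csv",
]
command_line = " ".join(command)
```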

Installation

This project is published on the Python Package Index (PyPI) at: https://pypi.org/project/daccuracy/. It should be installable from Python distribution platforms or Integrated Development Environments (IDEs). Otherwise, it can be installed from a command console using pip:

Installation:

  • For all users (after acquiring administrative rights): pip install daccuracy

  • For the current user (no administrative rights required): pip install --user daccuracy

Update:

  • For all users: pip install --upgrade daccuracy

  • For the current user: pip install --user --upgrade daccuracy

Dependencies

The development relies on several packages:

  • Mandatory: matplotlib, numpy, scikit-image, scipy

  • Optional: None

If not already present, the mandatory dependencies are installed automatically by pip as part of the installation of DAccuracy; Python distribution platforms and Integrated Development Environments (IDEs) should also take care of this. The optional dependencies, if any, enable added functionalities of DAccuracy and must be installed independently by following the related instructions.

Brief Description

DAccuracy (Detection Accuracy) makes it possible to compute

  • some accuracy measures

  • on an N-dimensional detection or segmentation image

  • when the ground-truth is available as a CSV file, an image, or a Numpy file.

It works in 3 contexts:

  • one-to-one: single ground-truth, single detection image;

  • one-to-many: single ground-truth, several detection images (typically obtained by different methods);

  • many-to-many: set of “(ground-truth, detection image)” pairs.

Example console output (accuracy measures can also be written to a CSV file):

        Ground truth = ground-truth.csv
           Detection = detection.png
     N ground truths = 55
        N detections = 47
       True_positive = 43
      False_positive = 4
      False_negative = 12
           Precision = 0.9148936170212766
              Recall = 0.7818181818181819
            F1_score = 0.8431372549019609
         Froc_sample = (4, 0.7818181818181819)
Check_tp_fn_equal_gt = 55
Check_tp_fp_equal_dn = 47

Acknowledgments


The project is developed with PyCharm Community.

The code is formatted by Black, The Uncompromising Code Formatter.

The imports are ordered by isort ("isort your imports, so you don't have to").
