
segmentation-skeleton-metrics


Python package for performing a skeleton-based evaluation of a predicted segmentation of neural arbors. This tool detects topological mistakes (i.e., splits and merges) in the predicted segmentation by comparing it against ground truth skeletons. Once this comparison is complete, several statistics (e.g., edge accuracy, split count, merge count) are computed and returned in a dictionary.
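To make the idea concrete, here is a minimal sketch of how skeleton-based split and merge detection can work. The graph representation, the "xyz" node attribute, and the helper names below are illustrative assumptions, not this package's internals:

import networkx as nx
import numpy as np

def count_splits(skeleton: nx.Graph, pred_labels: np.ndarray) -> int:
    # A split: an edge of the ground truth skeleton whose endpoints
    # fall in two different predicted segments.
    def label_at(node):
        x, y, z = skeleton.nodes[node]["xyz"]  # assumed voxel coordinates
        return pred_labels[x, y, z]

    return sum(
        1
        for u, v in skeleton.edges
        if label_at(u) != 0 and label_at(v) != 0 and label_at(u) != label_at(v)
    )

def count_merges(skeletons, pred_labels: np.ndarray) -> int:
    # A merge: a single predicted segment that spans two or more
    # ground truth skeletons.
    label_to_skeletons = {}
    for i, skeleton in enumerate(skeletons):
        for node in skeleton.nodes:
            x, y, z = skeleton.nodes[node]["xyz"]
            label = pred_labels[x, y, z]
            if label != 0:
                label_to_skeletons.setdefault(label, set()).add(i)
    return sum(1 for ids in label_to_skeletons.values() if len(ids) > 1)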

Usage

Here is a simple example of evaluating a predicted segmentation. Note that this package supports a number of different input types; see the documentation for details.

import os

from aind_segmentation_evaluation.evaluate import run_evaluation
from aind_segmentation_evaluation.conversions import volume_to_graph
from tifffile import imread


if __name__ == "__main__":

    # Initializations
    data_dir = "./resources"
    target_graphs_dir = os.path.join(data_dir, "target_graphs")
    path_to_target_labels = os.path.join(data_dir, "target_labels.tif")
    pred_labels = imread(os.path.join(data_dir, "pred_labels.tif"))
    pred_graphs = volume_to_graph(pred_labels)

    # Evaluation
    stats = run_evaluation(
        target_graphs_dir,
        path_to_target_labels,
        pred_graphs,
        pred_labels,
        filetype="tif",
        output="tif",
        output_dir=data_dir,
        permute=[2, 1, 0],
        scale=[1.101, 1.101, 1.101],
    )

    # Write out results
    print("Graph-based evaluation...")
    for key, value in stats.items():
        print("   {}: {}".format(key, value))

Installation

To use the software, run the following in the root directory:

pip install -e .

To develop the code, run

pip install -e .[dev]

To install this package from PyPI, run

pip install aind-segmentation-evaluation

Pull requests

For internal members, please create a branch. For external members, please fork the repository and open a pull request from the fork. We'll primarily use Angular style for commit messages. Roughly, they should follow the pattern:

<type>(<scope>): <short summary>

where scope (optional) describes the packages affected by the code changes and type (mandatory) is one of the following (an example commit message appears after the list):

  • build: Changes that affect build tools or external dependencies (example scopes: pyproject.toml, setup.py)
  • ci: Changes to our CI configuration files and scripts (examples: .github/workflows/ci.yml)
  • docs: Documentation only changes
  • feat: A new feature
  • fix: A bugfix
  • perf: A code change that improves performance
  • refactor: A code change that neither fixes a bug nor adds a feature
  • test: Adding missing tests or correcting existing tests
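For example, a commit that fixes a bug in the evaluation code might be labeled as follows (the scope shown here is hypothetical):

fix(evaluate): handle predictions with no foreground labels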


Download files

Download the file for your platform.

Source Distribution

segmentation_skeleton_metrics-3.0.5.tar.gz (117.9 kB)


Built Distribution


segmentation_skeleton_metrics-3.0.5-py3-none-any.whl (17.7 kB)


File details

Details for the file segmentation_skeleton_metrics-3.0.5.tar.gz.


File hashes

Hashes for segmentation_skeleton_metrics-3.0.5.tar.gz:

  SHA256:      417c16c25c270292049130d44d7713b749f67508f3614e5c47fa9249424b5449
  MD5:         85dd17d28cfe0b6d3d1c09469417428f
  BLAKE2b-256: 0cec2fdd47f13218fcffc68f9fed832b3090fda26b5d29337f1c2c83a92564d5

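If you want to verify a download against the SHA256 digest above, one generic way (not specific to this package) is Python's hashlib:

import hashlib

# Path to the downloaded source distribution; adjust as needed.
path = "segmentation_skeleton_metrics-3.0.5.tar.gz"
with open(path, "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()

# Should match the SHA256 value listed above.
print(digest)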

File details

Details for the file segmentation_skeleton_metrics-3.0.5-py3-none-any.whl.


File hashes

Hashes for segmentation_skeleton_metrics-3.0.5-py3-none-any.whl:

  SHA256:      24d2fa22ca81f78c71894892294b19f4bb4fb000fbb344f1be9cf5b1adef7f92
  MD5:         027e4504347bf7b0bc3ca0f6369588d0
  BLAKE2b-256: 8c35468b15efa9a40af1062e89d35e946a60b6695c7442a07ea88aec586ebc48

