segmentation-skeleton-metrics


Python package for performing a skeleton-based evaluation of a predicted segmentation of neural arbors. The tool detects topological mistakes (i.e., splits and merges) in the predicted segmentation by comparing ground-truth skeletons against it. Once this comparison is complete, several statistics (e.g., edge accuracy, split count, merge count) are computed and returned in a dictionary.
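To make the two mistake types concrete, here is a minimal, self-contained sketch on toy data. It is not this package's implementation; the skeleton names, node IDs, and labels below are invented for illustration. A split is counted when two adjacent skeleton nodes fall in different predicted segments; a merge is counted when one predicted segment covers nodes from more than one ground-truth skeleton.

```python
from collections import defaultdict

# Toy data (hypothetical): two ground-truth skeletons as edge lists over
# node IDs, plus the predicted segment label observed at each node.
skeletons = {
    "axon_1": [(0, 1), (1, 2), (2, 3)],
    "axon_2": [(4, 5), (5, 6)],
}
pred_label = {0: 10, 1: 10, 2: 20, 3: 20,  # axon_1 crosses labels 10 and 20
              4: 20, 5: 30, 6: 30}         # label 20 also touches axon_2

def count_splits_and_merges(skeletons, pred_label):
    # Split: a skeleton edge whose endpoints carry different predicted labels.
    splits = sum(
        pred_label[a] != pred_label[b]
        for edges in skeletons.values()
        for a, b in edges
    )
    # Merge: a predicted label covering nodes from more than one skeleton.
    label_to_skels = defaultdict(set)
    for name, edges in skeletons.items():
        for a, b in edges:
            label_to_skels[pred_label[a]].add(name)
            label_to_skels[pred_label[b]].add(name)
    merges = sum(len(s) > 1 for s in label_to_skels.values())
    return {"split_count": splits, "merge_count": merges}

print(count_splits_and_merges(skeletons, pred_label))
# {'split_count': 2, 'merge_count': 1}
```

Here edges (1, 2) and (4, 5) cross label boundaries (two splits), and label 20 spans both skeletons (one merge).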

Usage

Here is a simple example of evaluating a predicted segmentation. Note that this package supports a number of different input types; see the documentation for details.

import os

from aind_segmentation_evaluation.evaluate import run_evaluation
from aind_segmentation_evaluation.conversions import volume_to_graph
from tifffile import imread


if __name__ == "__main__":

    # Initializations
    data_dir = "./resources"
    target_graphs_dir = os.path.join(data_dir, "target_graphs")
    path_to_target_labels = os.path.join(data_dir, "target_labels.tif")
    pred_labels = imread(os.path.join(data_dir, "pred_labels.tif"))
    pred_graphs = volume_to_graph(pred_labels)

    # Evaluation
    stats = run_evaluation(
        target_graphs_dir,
        path_to_target_labels,
        pred_graphs,
        pred_labels,
        filetype="tif",
        output="tif",
        output_dir=data_dir,
        permute=[2, 1, 0],
        scale=[1.101, 1.101, 1.101],
    )

    # Write out results
    print("Graph-based evaluation...")
    for key in stats.keys():
        print("   {}: {}".format(key, stats[key]))
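The permute and scale arguments in the example presumably reorder the coordinate axes (e.g., from zyx to xyz) and convert voxel indices into physical units. The helper below is an illustrative guess at that transform, not this package's code; consult the documentation for the actual semantics.

```python
def to_physical(voxel, permute=(2, 1, 0), scale=(1.101, 1.101, 1.101)):
    """Hypothetical sketch: reorder axes with ``permute``, then multiply
    each coordinate by the corresponding entry of ``scale``."""
    reordered = [voxel[i] for i in permute]
    return [r * s for r, s in zip(reordered, scale)]

# Example: a zyx voxel index mapped to scaled xyz coordinates.
print(to_physical([10, 20, 30]))
```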

Installation

To use the software, run the following from the root directory:

pip install -e .

To develop the code, run

pip install -e ".[dev]"

To install this package from PyPI, run

pip install aind-segmentation-evaluation

Pull requests

For internal members, please create a branch. For external members, please fork the repository and open a pull request from the fork. We'll primarily use Angular style for commit messages. Roughly, they should follow the pattern:

<type>(<scope>): <short summary>

where scope (optional) describes the packages affected by the code changes and type (mandatory) is one of:

  • build: Changes that affect build tools or external dependencies (example scopes: pyproject.toml, setup.py)
  • ci: Changes to our CI configuration files and scripts (examples: .github/workflows/ci.yml)
  • docs: Documentation only changes
  • feat: A new feature
  • fix: A bug fix
  • perf: A code change that improves performance
  • refactor: A code change that neither fixes a bug nor adds a feature
  • test: Adding missing tests or correcting existing tests
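For example, a bug fix scoped to a (hypothetical) conversions module might be committed as

    fix(conversions): correct axis ordering in volume_to_graph

while a documentation-only change needs no scope:

    docs: expand the usage example in the README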
