Python package for evaluating neuron segmentations in terms of the number of splits and merges

Project description

segmentation-skeleton-metrics

Python package for performing a skeleton-based evaluation of a predicted segmentation of neural arbors. This tool detects topological mistakes (i.e., splits and merges) in a predicted segmentation by comparing it against ground-truth skeletons. Once this comparison is complete, several statistics (e.g., edge accuracy, split count, merge count) are computed and returned in a dictionary.

Usage

Here is a simple example of evaluating a predicted segmentation. Note that this package supports a number of different input types; see the documentation for details.

import os

from aind_segmentation_evaluation.evaluate import run_evaluation
from aind_segmentation_evaluation.conversions import volume_to_graph
from tifffile import imread


if __name__ == "__main__":

    # Initializations
    data_dir = "./resources"
    target_graphs_dir = os.path.join(data_dir, "target_graphs")
    path_to_target_labels = os.path.join(data_dir, "target_labels.tif")
    pred_labels = imread(os.path.join(data_dir, "pred_labels.tif"))
    pred_graphs = volume_to_graph(pred_labels)

    # Evaluation
    stats = run_evaluation(
        target_graphs_dir,
        path_to_target_labels,
        pred_graphs,
        pred_labels,
        filetype="tif",
        output="tif",
        output_dir=data_dir,
        permute=[2, 1, 0],
        scale=[1.101, 1.101, 1.101],
    )

    # Write out results
    print("Graph-based evaluation...")
    for key, value in stats.items():
        print("   {}: {}".format(key, value))

Installation

To install the package from source, run the following in the root directory:

pip install -e .

To develop the code, run

pip install -e .[dev]

To install this package from PyPI, run

pip install aind-segmentation-evaluation

Pull requests

For internal members, please create a branch. For external members, please fork the repository and open a pull request from the fork. We primarily use Angular-style commit messages. Roughly, they should follow the pattern:

<type>(<scope>): <short summary>

where scope (optional) describes the packages affected by the code changes and type (mandatory) is one of:

  • build: Changes that affect build tools or external dependencies (example scopes: pyproject.toml, setup.py)
  • ci: Changes to our CI configuration files and scripts (examples: .github/workflows/ci.yml)
  • docs: Documentation only changes
  • feat: A new feature
  • fix: A bugfix
  • perf: A code change that improves performance
  • refactor: A code change that neither fixes a bug nor adds a feature
  • test: Adding missing tests or correcting existing tests
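For example, a commit that adds a new metric to the evaluation module might read as follows (the scope name here is hypothetical):

feat(evaluate): add per-neuron merge count metric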

