SEGOBE - Object-Based Evaluation of Segmentation Results
Segobe is a minimal, lightweight package for evaluating a segmentation mask against a reference (ground-truth) mask. It matches cells between the two masks, computes metrics such as intersection-over-union (IoU) and Dice, and classifies errors as splits, merges, and catastrophes, following the descriptions in Greenwald et al. 2022 and Schwartz et al. 2024. On top of this, it lets you choose the matching approach via the cost-matrix metric. Designed for cell segmentation evaluation, it handles large batches of samples efficiently.
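The per-object metrics can be sketched as follows. This is a minimal illustration of how IoU and Dice are computed between one ground-truth object and one predicted object in two labeled masks, not Segobe's actual implementation:

```python
import numpy as np

def iou_and_dice(ref_mask, eval_mask, ref_label, eval_label):
    """IoU and Dice between one reference object and one predicted
    object, taken from two integer-labeled masks (illustrative sketch)."""
    a = ref_mask == ref_label
    b = eval_mask == eval_label
    inter = np.logical_and(a, b).sum()
    union = np.logical_or(a, b).sum()
    iou = inter / union if union else 0.0
    total = a.sum() + b.sum()
    dice = 2 * inter / total if total else 0.0
    return iou, dice

# Toy 4x4 masks: one ground-truth cell vs. a slightly shifted prediction
ref = np.zeros((4, 4), dtype=int)
ref[1:3, 1:3] = 1            # GT cell covers 4 pixels
pred = np.zeros((4, 4), dtype=int)
pred[1:3, 2:4] = 1           # prediction overlaps 2 of those pixels
iou, dice = iou_and_dice(ref, pred, 1, 1)   # iou = 2/6, dice = 4/8
```

Here the two objects share 2 pixels out of a 6-pixel union, so IoU is 1/3 while Dice, which weights the intersection twice, is 0.5.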
Installation
Option 1. Install directly from the repository (recommended for development)
If you plan to develop or modify Segobe, install it in editable mode:
```bash
# Clone the repository
git clone https://github.com/schapirolabor/segobe.git
cd segobe

# (Optional) create the conda environment
conda env create -f environment.yml
conda activate segobe_env

# Install in editable/development mode
pip install -e .
```
The `-e` flag (editable mode) means that changes to the source code are reflected immediately without reinstalling.
Option 2. Install directly from GitHub
Once the repository is made public, users can install it directly via URL:
```bash
pip install git+https://github.com/schapirolabor/segobe.git
```
Verify the installation with

```bash
python -m segobe --help
```

or simply:

```bash
segobe --help
```

to see the CLI help message.
CLI
```bash
segobe \
  --input samples.csv \
  --output_dir results \
  --basename testing \
  --iou_threshold 0.5 \
  --graph_iou_threshold 0.1 \
  --unmatched_cost 0.4 \
  --save_plots
```
| Argument | Long form | Description |
|---|---|---|
| `-i` | `--input_csv` | File path to a CSV with columns: `sampleID`, `ref_mask`, `eval_mask`, `category` |
| `-o` | `--output_dir` | Directory to save output metrics and plots |
| `-b` | `--basename` | Unique basename used when saving metrics and plots |
|  | `--iou_threshold` | IoU threshold for cell matching (0-1, default: 0.5). A pair is a true match if it is selected by `linear_sum_assignment` and its IoU is above this threshold. |
|  | `--graph_iou_threshold` | Graph IoU threshold for error detection (0-1, default: 0.1). Minimal IoU for cells to be considered "connected". |
|  | `--unmatched_cost` | Cost for unmatched objects in the cost matrix (0-1, default: 0.4) |
|  | `--cost_matrix_metric` | Metric used to build the cost matrix (default: `iou`; other options: `dice`, `moc`). Note that only IoU is currently supported. |
|  | `--save_plots` | Boolean specifying whether plots (a barplot grouped by category and a per-row error overview) are saved |
|  | `--plot_target_size` | Size in pixels of the error-type subfigures. Larger inputs are approximately downsampled by a scale factor; if that factor exceeds 4, boundaries are not drawn. (default: 600) |
|  | `--version` | Prints the tool version |
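The interplay of `--iou_threshold` and the cost matrix can be illustrated with a small sketch. This is assumed behavior based on the argument descriptions, not Segobe's exact code: minimize the total cost `1 - IoU` with `scipy.optimize.linear_sum_assignment`, then keep only assignments whose IoU clears the threshold.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

# Hypothetical pairwise IoU matrix: rows = GT cells, cols = predicted cells
iou = np.array([
    [0.8, 0.1, 0.0],
    [0.0, 0.6, 0.2],
    [0.0, 0.0, 0.3],
])
iou_threshold = 0.5

# Minimize total cost (1 - IoU), i.e. maximize total IoU over all pairs
rows, cols = linear_sum_assignment(1.0 - iou)

# Keep only assignments whose IoU is above the threshold
matches = [(r, c) for r, c in zip(rows, cols) if iou[r, c] > iou_threshold]
```

In this toy example the assignment pairs each cell with its diagonal counterpart, but the third pair (IoU 0.3) is discarded by the threshold, leaving that GT cell a false negative and that prediction a false positive.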
Input format
Example input CSV for a potential use case: comparing two methods across two samples (e.g., the same ROI).
| sampleID | ref_mask | eval_mask | category |
|---|---|---|---|
| sample1 | path/to/groundtruth1.tif | path/to/prediction1_1.tif | method1 |
| sample1 | path/to/groundtruth1.tif | path/to/prediction1_2.tif | method2 |
| sample2 | path/to/groundtruth2.tif | path/to/prediction2_1.tif | method1 |
| sample2 | path/to/groundtruth2.tif | path/to/prediction2_2.tif | method2 |
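A CSV in this shape can be produced with the standard library, for example as below. The paths are placeholders taken from the table above; substitute your own mask files:

```python
import csv

rows = [
    ("sample1", "path/to/groundtruth1.tif", "path/to/prediction1_1.tif", "method1"),
    ("sample1", "path/to/groundtruth1.tif", "path/to/prediction1_2.tif", "method2"),
    ("sample2", "path/to/groundtruth2.tif", "path/to/prediction2_1.tif", "method1"),
    ("sample2", "path/to/groundtruth2.tif", "path/to/prediction2_2.tif", "method2"),
]

with open("samples.csv", "w", newline="") as f:
    writer = csv.writer(f)
    # Header must match the column names Segobe expects
    writer.writerow(["sampleID", "ref_mask", "eval_mask", "category"])
    writer.writerows(rows)
```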
Outputs
Expected outputs given the CLI example and input CSV:
```
results/
├── testing_metrics.csv    # Full per-sample segmentation evaluation results
├── testing_summary.csv    # Aggregated metrics summarized by category
└── plots/
    ├── testing_metrics_barplot.png             # Optional barplot of summary metrics (if --save_plots)
    ├── testing_sample1_method1_error_plot.png  # Optional per-sample error visualizations (if --save_plots)
    ├── testing_sample2_method1_error_plot.svg
    ├── testing_sample1_method2_error_plot.svg
    └── testing_sample2_method2_error_plot.png
```
Metrics captured in the summary CSV, aggregated across all inputs and grouped by category:
- IoU: mean and std
- Dice: mean and std
- Precision: mean and std
- Recall: mean and std
- F1 score: mean and std
- Splits: counts
- Merges: counts
- Catastrophes: counts
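The mean/std aggregation above can be reproduced from the per-sample metrics with pandas. This is a sketch with toy values; the column names are illustrative assumptions, not Segobe's exact schema:

```python
import pandas as pd

# Toy per-sample metrics table with the same grouping structure
metrics = pd.DataFrame({
    "category": ["method1", "method1", "method2", "method2"],
    "IoU":      [0.80, 0.70, 0.60, 0.90],
    "splits":   [1, 0, 2, 1],
})

# Continuous metrics get mean and std; error types get summed counts
summary = metrics.groupby("category").agg(
    IoU_mean=("IoU", "mean"),
    IoU_std=("IoU", "std"),
    splits=("splits", "sum"),
)
```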
Metrics captured in the metrics CSV for each input CSV row:
- IoU: mean and list of all values
- Dice: mean and list of all values
- Precision
- Recall
- F1 score
- Splits: counts and dictionary of matched predictions to GTs
- Merges: counts and dictionary of matched GTs to predictions
- Catastrophes: counts and dictionary of groups of GTs and predictions involved
- True positives: counts
- False positives: counts
- False negatives: counts
- FP_list: list of false positive labels
- FN_list: list of false negative labels
- TTP_gt: list of GT labels of true positives
- TTP_preds: list of prediction labels of true positives
- total_cells: count of GT labels
- total_pred_cells: count of prediction labels
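Precision, recall, and F1 follow directly from the object-level TP/FP/FN counts listed above. As a quick reference, these are the standard definitions, not Segobe-specific code:

```python
def prf1(tp, fp, fn):
    """Object-level precision, recall, and F1 from match counts."""
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1

# e.g. 8 matched cells, 2 spurious predictions, 1 missed GT cell
p, r, f = prf1(tp=8, fp=2, fn=1)   # p = 0.8, r = 8/9, f ≈ 0.842
```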
Detailed description
See the Detailed overview for a deeper explanation of the tool's functionality.
Contributing
Contributions, issues, and feature requests are welcome!
Feel free to open a pull request or submit an issue on GitHub Issues.
Before submitting a PR:
- Run tests (not yet applicable)
- Follow existing code style and documentation patterns
Citing
If you use Segobe in your work, please cite:
Bestak, K. Segobe: Object-Based Evaluation of Segmentation Results. Available at: https://github.com/schapirolabor/segobe
Note that when referencing the segmentation error definitions used here, Greenwald et al. 2022 must also be cited.