
Create annotations for instance segmentation using Segment Anything models


napari-SAM4IS


napari plugin for instance and semantic segmentation annotation using Segment Anything Model (SAM)

This is a plugin for napari, a multi-dimensional image viewer for Python, for instance and semantic segmentation annotation. It provides an easy-to-use interface for annotating images and can export annotations in COCO format.


This napari plugin was generated with Cookiecutter using @napari's cookiecutter-napari-plugin template.

Installation

Requirements: Python 3.10-3.13

Step 1: Install napari-SAM4IS

You can install napari-SAM4IS via pip:

pip install napari-SAM4IS

Or via conda:

conda install -c conda-forge napari-SAM4IS

To install the latest development version:

pip install git+https://github.com/hiroalchem/napari-SAM4IS.git

Step 2: Install Segment Anything Model (Optional - for local model usage)

Note: Installing the Segment Anything Model is only required if you plan to use local models. If you're using the API mode, you can skip this step.

To use local models, install SAM:

pip install git+https://github.com/facebookresearch/segment-anything.git

Or you can install from source by cloning the repository:

git clone https://github.com/facebookresearch/segment-anything.git
cd segment-anything
pip install -e .

For more detailed instructions, please refer to the SAM installation guide.

Usage

Preparation

  1. Open an image in napari and launch the plugin. (Opening an image after launching the plugin is also possible.)
  2. Upon launching the plugin, several layers will be automatically created: SAM-Box, SAM-Positive, SAM-Negative, SAM-Predict, and Accepted. The usage of these layers will be explained later.
  3. Choose between local model or API mode:
    • Local Model Mode: Select the model you want to use and click the load button. (The default option is recommended.)
    • API Mode: Check the "Use API" checkbox, then enter your API URL and API Key. No model loading is required.
  4. Next, select the image layer you want to annotate.
  5. Then, select whether you want to do instance segmentation or semantic segmentation. (Note that for 3D images, semantic segmentation should be chosen in the current version.)
  6. Finally, select the output layer: "shapes" for instance segmentation or "labels" for semantic segmentation. (For instance segmentation, the "Accepted" layer can also be used.)

Class Management

You can define annotation classes to assign to each segmented object.

  1. In the Class Management section, type a class name and click Add (or press Enter) to add a new class.
  2. Click a class in the list to select it. The selected class will be assigned to subsequent annotations.
  3. To reassign a class, select an existing annotation in the output Shapes layer, then click the desired class.
  4. Classes in use cannot be deleted. Remove the associated annotations first.
  5. You can load class definitions from a YAML file (click Load). The expected format is:
    names:
      0: cat
      1: dog
      2: bird
    
  6. Class definitions are automatically saved as class.yaml alongside the COCO JSON output.
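As an illustration, the names mapping above can be read back with a few lines of Python. This is a minimal sketch that assumes exactly the class.yaml layout shown; a robust loader would use a YAML parser such as PyYAML instead of splitting lines by hand.

```python
# Minimal sketch: read the "names" mapping from class.yaml-style text.
# Assumes exactly the layout shown above; real code should use PyYAML.
text = """names:
  0: cat
  1: dog
  2: bird
"""

names = {}
for line in text.splitlines()[1:]:  # skip the "names:" header line
    key, _, value = line.strip().partition(": ")
    if value:
        names[int(key)] = value

print(names)  # {0: 'cat', 1: 'dog', 2: 'bird'}
```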

Annotation with SAM

  1. Select the SAM-Box layer and use the rectangle tool to enclose the object you want to segment.
  2. An automatic segmentation mask will be created and output to the SAM-Predict layer.
  3. You can refine the prediction by adding point prompts: click on the SAM-Positive layer to add points that should be included, or on the SAM-Negative layer to add points that should be excluded.
  4. If you want to make further adjustments, do so in the SAM-Predict layer.
  5. To accept or reject the annotation, press A or R on the keyboard, respectively.
  6. If you accept the annotation, it will be output as label 1 for semantic segmentation or converted to a polygon and output to the designated layer for instance segmentation. The currently selected class will be assigned to the annotation.
  7. If you reject the annotation, the segmentation mask in the SAM-Predict layer will be discarded.
  8. After accepting or rejecting the annotation, the SAM-Predict layer will automatically reset to blank and return to the SAM-Box layer.
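Under the hood, a box prompt refined with positive and negative points maps onto the Segment Anything `SamPredictor` API roughly as follows. This is a sketch of the upstream API, not the plugin's actual code; the checkpoint path, model type, and coordinates are placeholders.

```python
import numpy as np
from segment_anything import sam_model_registry, SamPredictor

# Placeholder checkpoint path; download a real checkpoint from the SAM repo.
sam = sam_model_registry["vit_b"](checkpoint="sam_vit_b.pth")
predictor = SamPredictor(sam)

# image: HxWx3 uint8 RGB array, e.g. the currently selected napari image layer.
image = np.zeros((256, 256, 3), dtype=np.uint8)
predictor.set_image(image)

masks, scores, logits = predictor.predict(
    box=np.array([50, 50, 150, 150]),          # XYXY box from SAM-Box
    point_coords=np.array([[100, 100], [60, 60]]),  # clicks from SAM-Positive/Negative
    point_labels=np.array([1, 0]),             # 1 = positive, 0 = negative
    multimask_output=False,
)
# masks[0] is the binary mask shown in the SAM-Predict layer.
```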

Manual Annotation (without SAM)

You can also annotate without using SAM by enabling Manual Mode.

  1. Check the Manual Mode checkbox. SAM-related controls and layers will be hidden.
  2. The SAM-Predict layer switches to paint mode. Use napari's standard Labels tools (paint brush, eraser, fill) from the layer controls panel to draw your annotation.
  3. Adjust brush size using napari's standard Labels controls.
  4. Press A to accept or R to reject, just like SAM mode.
  5. After accepting, the painted mask is converted to a polygon (instance mode) or merged into the output Labels layer (semantic mode), with the selected class assigned.
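In semantic mode, the merge step amounts to writing the selected class's label value into the output Labels layer wherever the painted mask is set. A minimal numpy sketch (the array shapes and the label-value convention are assumptions for illustration, not the plugin's actual code):

```python
import numpy as np

# Hypothetical output Labels layer and a painted binary mask of the same shape.
labels = np.zeros((4, 4), dtype=np.uint8)
mask = np.zeros((4, 4), dtype=bool)
mask[1:3, 1:3] = True  # the region drawn with the paint brush

selected_class_value = 2  # assumed label value for the selected class
labels[mask] = selected_class_value

print(labels)
```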

Annotation Attributes

Each annotation can have additional attributes to support quality control workflows.

  1. Select one or more annotations in the output Shapes layer.
  2. In the Attributes panel, you can set:
    • Unclear: Mark annotations where the object boundary is ambiguous.
    • Uncertain: Mark annotations where the object class is uncertain.
    • Reviewed at: Automatically records a timestamp when an annotation is reviewed. Click Mark reviewed to set, or Clear to reset.
  3. Attributes are saved as part of the COCO JSON output under each annotation's "attributes" field.
  4. When multiple annotations are selected with mixed attribute values, checkboxes show a mixed state indicator.
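In COCO JSON terms, each annotation carries an extra "attributes" object, and the mixed-state logic reduces to checking whether all selected annotations agree on a value. The sketch below is illustrative only; the exact field names inside "attributes" are guesses based on the panel labels above.

```python
# Hypothetical annotations as they might appear in the COCO JSON output.
annotations = [
    {"id": 1, "attributes": {"unclear": False, "uncertain": True, "reviewed_at": None}},
    {"id": 2, "attributes": {"unclear": True, "uncertain": True, "reviewed_at": "2024-01-01T12:00:00"}},
]

def checkbox_state(selected, key):
    """Return the shared value when all selected annotations agree, else "mixed"."""
    values = {a["attributes"][key] for a in selected}
    return values.pop() if len(values) == 1 else "mixed"

print(checkbox_state(annotations, "unclear"))    # mixed
print(checkbox_state(annotations, "uncertain"))  # True
```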

Saving and Loading Annotations

  1. If you have output to the labels layer, use napari's standard functionality to save the mask.
  2. If you have output to the shapes layer, you can save it using napari's standard functionality, or click the Save button to write a COCO-format JSON file for each image in the folder. (Each JSON file shares its image's name.) Class definitions are also saved as class.yaml in the same directory.
  3. To load previously saved annotations, click the Load button and select a COCO JSON file. Annotations, class definitions, and attributes will be restored.
  4. When switching images via the Image ComboBox, the plugin will:
    • Prompt to save unsaved annotations (Save / Discard / Cancel)
    • Automatically clear the output layer
    • Auto-load annotations from a matching JSON file if one exists

Contributing

Contributions are very welcome. Tests can be run with tox; please ensure that coverage at least stays the same before you submit a pull request.

License

Distributed under the terms of the Apache Software License 2.0, "napari-SAM4IS" is free and open source software.

Issues

If you encounter any problems, please file an issue along with a detailed description.
