napari-SAM4IS
napari plugin for instance and semantic segmentation annotation using Segment Anything Model (SAM)
This is a plugin for napari, a multi-dimensional image viewer for Python, that supports both instance and semantic segmentation annotation. It provides an easy-to-use interface for annotating images, with the option to export annotations in COCO format.
This napari plugin was generated with Cookiecutter using @napari's cookiecutter-napari-plugin template.
Installation
Requirements: Python 3.10-3.13
Step 1: Install napari-SAM4IS
You can install napari-SAM4IS via pip:
pip install napari-SAM4IS
Or via conda:
conda install -c conda-forge napari-SAM4IS
To install the latest development version:
pip install git+https://github.com/hiroalchem/napari-SAM4IS.git
Step 2: Install Segment Anything Model (Optional - for local model usage)
Note: Installing the Segment Anything Model is only required if you plan to use local models. If you're using the API mode, you can skip this step.
To use local models, install SAM:
pip install git+https://github.com/facebookresearch/segment-anything.git
Or you can install from source by cloning the repository:
git clone https://github.com/facebookresearch/segment-anything.git
cd segment-anything
pip install -e .
For more detailed instructions, please refer to the SAM installation guide.
Usage
Preparation
- Open an image in napari and launch the plugin. (Opening an image after launching the plugin is also possible.)
- Upon launching the plugin, several layers will be automatically created: SAM-Box, SAM-Positive, SAM-Negative, SAM-Predict, and Accepted. The usage of these layers will be explained later.
- Choose between local model or API mode:
- Local Model Mode: Select the model you want to use and click the load button. (The default option is recommended.)
- API Mode: Check the "Use API" checkbox, then enter your API URL and API Key. No model loading is required.
- Next, select the image layer you want to annotate.
- Then, select whether you want to do instance segmentation or semantic segmentation. (Note that for 3D images, semantic segmentation should be chosen in the current version.)
- Finally, select the output layer as "shapes" for instance segmentation or "labels" for semantic segmentation. (For instance segmentation, the "Accept" layer can also be used.)
Class Management
You can define annotation classes to assign to each segmented object.
- In the Class Management section, type a class name and click Add (or press Enter) to add a new class.
- Click a class in the list to select it. The selected class will be assigned to subsequent annotations.
- To reassign a class, select an existing annotation in the output Shapes layer, then click the desired class.
- Classes in use cannot be deleted. Remove the associated annotations first.
- You can load class definitions from a YAML file (click Load). The expected format is:
  names:
    0: cat
    1: dog
    2: bird
- Class definitions are automatically saved as class.yaml alongside the COCO JSON output.
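For illustration, the flat names: mapping shown above can be read without any YAML library. A real loader would typically use PyYAML's yaml.safe_load; this minimal sketch handles only the format shown:

```python
def parse_class_yaml(text):
    """Parse the flat ``names:`` index-to-name mapping used for class files.

    Minimal sketch only: handles lines of the form ``index: name`` under a
    ``names:`` key, skipping blanks and comments. Use yaml.safe_load for
    real YAML files.
    """
    classes = {}
    in_names = False
    for line in text.splitlines():
        stripped = line.strip()
        if not stripped or stripped.startswith("#"):
            continue
        if stripped == "names:":
            in_names = True
            continue
        if in_names and ":" in stripped:
            key, _, value = stripped.partition(":")
            classes[int(key)] = value.strip()
    return classes
```
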
Annotation with SAM
- Select the SAM-Box layer and use the rectangle tool to enclose the object you want to segment.
- An automatic segmentation mask will be created and output to the SAM-Predict layer.
- You can refine the prediction by adding point prompts: click on the SAM-Positive layer to add points that should be included, or on the SAM-Negative layer to add points that should be excluded.
- If you want to make further adjustments, do so in the SAM-Predict layer.
- To accept or reject the annotation, press A or R on the keyboard, respectively.
- If you accept the annotation, it will be output as label 1 for semantic segmentation or converted to a polygon and output to the designated layer for instance segmentation. The currently selected class will be assigned to the annotation.
- If you reject the annotation, the segmentation mask in the SAM-Predict layer will be discarded.
- After accepting or rejecting the annotation, the SAM-Predict layer will automatically reset to blank and the active layer will switch back to SAM-Box.
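For semantic segmentation, the accept step can be pictured as merging the SAM-Predict mask into the output Labels layer as label 1, then clearing the prediction. A minimal NumPy sketch (array shapes and variable names are illustrative, not the plugin's internals):

```python
import numpy as np

# toy stand-ins for the plugin's layers
sam_predict = np.zeros((6, 6), dtype=np.uint8)   # SAM-Predict layer
sam_predict[2:4, 2:5] = 1                        # mask returned by SAM

labels = np.zeros((6, 6), dtype=np.uint8)        # output Labels layer

# pressing A (accept) in semantic mode: write the mask as label 1
labels[sam_predict > 0] = 1

# after accept or reject, SAM-Predict is reset to blank
sam_predict[:] = 0
```
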
Manual Annotation (without SAM)
You can also annotate without using SAM by enabling Manual Mode.
- Check the Manual Mode checkbox. SAM-related controls and layers will be hidden.
- The SAM-Predict layer switches to paint mode. Use napari's standard Labels tools (paint brush, eraser, fill) from the layer controls panel to draw your annotation.
- Adjust brush size using napari's standard Labels controls.
- Press A to accept or R to reject, just like SAM mode.
- After accepting, the painted mask is converted to a polygon (instance mode) or merged into the output Labels layer (semantic mode), with the selected class assigned.
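In instance mode, accepting requires turning the binary mask into a polygon. The plugin's actual contour routine is not shown here; as an illustrative stand-in, the sketch below takes the convex hull of the mask's pixels (a real contour trace would also preserve concave boundaries):

```python
import numpy as np

def mask_to_polygon(mask):
    """Approximate a binary mask as a polygon via the convex hull of its pixels.

    Illustrative only: concave shapes need a proper contour-finding routine.
    Uses Andrew's monotone chain; returns hull vertices as [row, col] pairs.
    """
    pts = sorted(map(tuple, np.argwhere(mask)))
    if len(pts) <= 2:
        return [list(p) for p in pts]

    def cross(o, a, b):
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    # drop the duplicated endpoints where the two chains meet
    return [list(p) for p in lower[:-1] + upper[:-1]]
```
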
Annotation Attributes
Each annotation can have additional attributes to support quality control workflows.
- Select one or more annotations in the output Shapes layer.
- In the Attributes panel, you can set:
- Unclear: Mark annotations where the object boundary is ambiguous.
- Uncertain: Mark annotations where the object class is uncertain.
- Reviewed at: Automatically records a timestamp when an annotation is reviewed. Click Mark reviewed to set, or Clear to reset.
- Attributes are saved as part of the COCO JSON output under each annotation's "attributes" field.
- When multiple annotations are selected with mixed attribute values, the checkboxes show a mixed-state indicator.
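As a sketch of what a single annotation entry could look like with these attributes attached (the exact key names, timestamp format, and values below are assumptions for illustration, not taken from the plugin's source):

```python
import json

# hypothetical COCO annotation entry carrying the plugin's extra
# "attributes" field; ids and coordinates are illustrative
annotation = {
    "id": 1,
    "image_id": 1,
    "category_id": 2,
    "segmentation": [[10.0, 10.0, 40.0, 10.0, 40.0, 30.0, 10.0, 30.0]],
    "bbox": [10.0, 10.0, 30.0, 20.0],
    "area": 600.0,
    "iscrowd": 0,
    "attributes": {
        "unclear": False,
        "uncertain": True,
        "reviewed_at": "2024-05-01T12:00:00",
    },
}
print(json.dumps(annotation, indent=2))
```
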
Saving and Loading Annotations
- If you have output to the labels layer, use napari's standard functionality to save the mask.
- If you have output to the shapes layer, you can save it using napari's standard functionality, or click the Save button to write a COCO-format JSON file for each image in the folder. (The JSON file will have the same name as the image.) Class definitions will also be saved as class.yaml in the same directory.
- To load previously saved annotations, click the Load button and select a COCO JSON file. Annotations, class definitions, and attributes will be restored.
- When switching images via the Image ComboBox, the plugin will:
- Prompt to save unsaved annotations (Save / Discard / Cancel)
- Automatically clear the output layer
- Auto-load annotations from a matching JSON file if one exists
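Putting the pieces together, a COCO JSON file saved next to an image might look roughly like the skeleton below (field values are illustrative; the actual keys written by the plugin may differ):

```python
import json

# skeletal COCO file for one image, as the Save step might produce;
# filenames, sizes, and coordinates here are made-up examples
coco = {
    "images": [{"id": 1, "file_name": "sample.png", "width": 512, "height": 512}],
    "categories": [{"id": 1, "name": "cat"}, {"id": 2, "name": "dog"}],
    "annotations": [
        {
            "id": 1,
            "image_id": 1,
            "category_id": 1,
            "segmentation": [[100.0, 100.0, 200.0, 100.0, 200.0, 180.0, 100.0, 180.0]],
            "bbox": [100.0, 100.0, 100.0, 80.0],
            "area": 8000.0,
            "iscrowd": 0,
        }
    ],
}

# saved with the same base name as the image (sample.png -> sample.json)
with open("sample.json", "w") as f:
    json.dump(coco, f, indent=2)
```
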
Contributing
Contributions are very welcome. Tests can be run with tox; please ensure that test coverage at least stays the same before you submit a pull request.
License
Distributed under the terms of the Apache Software License 2.0, "napari-SAM4IS" is free and open source software.
Issues
If you encounter any problems, please file an issue along with a detailed description.