

pbaa : Prompt-Based Automatic Annotation


An easy inference implementation of Grounded-SAM for prompt-based automatic annotation


Table of Contents

Installation
Usage
Demo
License
Acknowledgements

Installation

Docker (Recommended)

git clone https://github.com/dh031200/pbaa.git
docker build docker -t pbaa:latest
docker run --gpus all -it --ipc=host -v `pwd`:/workspace -p 7860:7860 pbaa:latest

Without Docker

The code requires Python >= 3.8 and CUDA 11.7.

pip install pbaa

Usage

Options

Usage: pbaa [OPTIONS]

Options:
  --version                    Show the version and exit.
  -s, --src TEXT               Source image or directory path
  -p, --prompt <TEXT TEXT>...  Space-separated pair of prompt and target
                               class (may be given multiple times)
  -b, --box_threshold FLOAT    Threshold for Object Detection (default: 0.25)
  -n, --nms_threshold FLOAT    Threshold for NMS (default: 0.8)
  -o, --output_dir TEXT        Path to result data (default: 'outputs')
  -g, --gradio                 Launch gradio app
  -h, --help                   Show this message and exit.

CLI

# pbaa -s <Source> -p <prompt> <class> -p <prompt> <class> ...

pbaa -s source_image.jpg -p "black dog" dog
pbaa -s source_image.jpg -p "black dog" dog -p "white cat" cat

Python

from pbaa import PBAA

annotator = PBAA()
# inference(<Source path>, <prompt:class dict>, box_threshold=0.25, nms_threshold=0.8, save=None, output_dir="outputs")
annotator("path/to/source_image.jpg", {"black dog": "dog", "white cat": "cat"})

Gradio

Run the Gradio demo with a single command:

pbaa -g

Output

Launch gradio app
Running on local URL:  http://0.0.0.0:7860

You can now access the Gradio demo in your browser at localhost:7860.

(Screenshot: Gradio demo interface)

Demo

# Source : assets/demo9.jpg
# prompts : {"plant" : "plant", "picture" : "picture", "dog": "dog", "lamp" : "lamp", "carpet" : "carpet", "sofa" : "sofa"}

pbaa -s assets/demo9.jpg -p plant plant -p picture picture -p dog dog -p lamp lamp -p carpet carpet -p sofa sofa
(Demo images: original, detection result, segmentation result)

Result data

demo9.json

JSON structure

filename
prompt
index
  ├ cls : class name
  ├ conf : confidence score
  ├ box : bounding box coordinates
  └ poly : polygon coordinates
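For illustration only, a plausible result file rendered as a Python dict with the fields above; the exact layout and all values are assumptions (detections are assumed to be keyed by their index), not copied from real output:

# Illustrative only -- field names follow the structure above, values are made up
{
    "filename": "demo9.jpg",                        # source image filename
    "prompt": {"dog": "dog", "sofa": "sofa"},       # prompts and their target classes
    "0": {                                          # index of one detected object
        "cls": "dog",                               # class name
        "conf": 0.87,                               # confidence score
        "box": [120.0, 45.0, 380.0, 290.0],         # bounding box coordinates
        "poly": [[130, 60], [150, 52], [162, 71]],  # polygon coordinates
    },
}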

License

pbaa is distributed under the terms of the Apache-2.0 license.

Acknowledgements

Grounded-Segment-Anything : https://github.com/IDEA-Research/Grounded-Segment-Anything
Grounding DINO : https://github.com/IDEA-Research/GroundingDINO
Segment-anything : https://github.com/facebookresearch/segment-anything

