
SegGPT for use with Autodistill

Project description

Autodistill SegGPT Module

This repository contains the code supporting the SegGPT base model for use with Autodistill.

SegGPT is a transformer-based, few-shot semantic segmentation model developed by BAAI Vision.

This model performs well on domain-specific segmentation tasks when given a few labeled images from which to learn the features of the objects you want to identify.

Read the full Autodistill documentation.

Read the SegGPT Autodistill documentation.

Installation

To use SegGPT with Autodistill, you need to install the following dependency:

pip3 install autodistill-seggpt

About SegGPT

SegGPT performs "in-context" segmentation, which means it requires a handful of pre-labeled "context" images showing examples of the objects you want to segment.

You will need some labeled images to use SegGPT. Don't have any labeled images? Check out Roboflow Annotate, a feature-rich annotation tool from which you can export data for use with Autodistill.

Quickstart

from autodistill_seggpt import SegGPT, FewShotOntology

# supervision_dataset is a supervision.DetectionDataset containing your labeled
# reference images (see "How to load data from Roboflow" below)
base_model = SegGPT(
    ontology=FewShotOntology(supervision_dataset)
)

# auto-label every .jpg image in the folder
base_model.label("./unlabelled-photos", extension=".jpg")
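
To spot-check results before labeling a whole folder, you can run the base model on a single image and overlay the returned masks with supervision. The snippet below is a minimal sketch: it assumes the standard Autodistill predict() method, which returns a supervision.Detections object, and uses a hypothetical example.jpg path.

import cv2
import supervision as sv

# run SegGPT on one image; predict() is assumed to return supervision.Detections
# with segmentation masks, as with other Autodistill base models
detections = base_model.predict("./unlabelled-photos/example.jpg")

# overlay the predicted masks on the image for a quick visual check
image = cv2.imread("./unlabelled-photos/example.jpg")
annotated = sv.MaskAnnotator().annotate(scene=image.copy(), detections=detections)
cv2.imwrite("annotated-example.jpg", annotated)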

How to load data from Roboflow

Labeling and importing images is easy!

You can use Roboflow Annotate to label a few images (1-3 should work fine). For your Project Type, make sure to pick Instance Segmentation, as you will be labeling with polygons.

Once you have labeled your images, press Generate > Generate New Version. The default options are fine; no Augmentations are necessary.

Once your dataset version is generated, you can press Export > Continue.

Then you will get some download code to copy. It should look something like this:

!pip install roboflow

from roboflow import Roboflow

# replace the API key, workspace, and project with the values from your export dialog
rf = Roboflow(api_key="ABCDEFG")
project = rf.workspace("lorem-ipsum").project("dolor-sit-amet")
dataset = project.version(1).download("yolov8")

Note: if you are not using a notebook environment, you should remove !pip install roboflow from your code, and run pip install roboflow in your terminal instead.

To import your dataset into Autodistill, run the following:

import supervision as sv

# load the YOLOv8-format export as a supervision DetectionDataset;
# force_masks=True converts the polygon labels into segmentation masks
supervision_dataset = sv.DetectionDataset.from_yolo(
    images_directory_path=f"{dataset.location}/train/images",
    annotations_directory_path=f"{dataset.location}/train/labels",
    data_yaml_path=f"{dataset.location}/data.yaml",
    force_masks=True
)
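
Before building the ontology, it can help to confirm the dataset loaded correctly. A quick check (assuming the supervision.DetectionDataset attributes used above):

# confirm the class names from data.yaml and the number of labeled reference images
print(supervision_dataset.classes)
print(len(supervision_dataset))

You can then pass supervision_dataset to FewShotOntology as shown in the Quickstart above.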

License

The code in this repository is licensed under an MIT license.

See the SegGPT repository for more information on the SegGPT license.

🏆 Contributing

We love your input! Please see the core Autodistill contributing guide to get started. Thank you 🙏 to all our contributors!

Download files

Download the file for your platform.

Source Distribution

autodistill-seggpt-0.1.6.tar.gz (22.0 kB)

Built Distribution

autodistill_seggpt-0.1.6-py3-none-any.whl (29.6 kB)

File details

Details for the file autodistill-seggpt-0.1.6.tar.gz.

File metadata

  • Download URL: autodistill-seggpt-0.1.6.tar.gz
  • Upload date:
  • Size: 22.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.7.12

File hashes

Hashes for autodistill-seggpt-0.1.6.tar.gz

  • SHA256: dea22ee158f5ab791ef1b9ecab528df2532d7040f683d4b7ec684546c7164ded
  • MD5: f562197bbf07ba69905d92f0a28d64c0
  • BLAKE2b-256: aef5e4b64f982cbe2d57a77aee186f022db48ccbb7bbc58a9f647821f336432e


File details

Details for the file autodistill_seggpt-0.1.6-py3-none-any.whl.

File metadata

File hashes

Hashes for autodistill_seggpt-0.1.6-py3-none-any.whl

  • SHA256: f6519e05ad6ea0556b1cefc01a6adaa724b74aaf17e0a775282eeafcf86398d8
  • MD5: c1f2b061aa32d96e704cf99bd3ed22e5
  • BLAKE2b-256: 4f2b8a12c23ef22c8fbaf0a9181f7a82123a5b59300a4e5e350017cc96557e12

