
Autodistill SegGPT Module

This repository contains the code supporting the SegGPT base model for use with Autodistill.

SegGPT is a transformer-based, few-shot semantic segmentation model developed by BAAI Vision.

This model performs well on domain-specific segmentation tasks when given a few labeled images from which to learn the features of the objects you want to identify.

Read the full Autodistill documentation.

Read the SegGPT Autodistill documentation.

Installation

To use SegGPT with Autodistill, you need to install the following dependency:

pip3 install autodistill-seggpt

About SegGPT

SegGPT performs "in-context" segmentation. This means it requires a handful of pre-labeled "context" images from which it learns what to segment.

You will need some labeled images to use SegGPT. Don't have any labeled images? Check out Roboflow Annotate, a feature-rich annotation tool from which you can export data for use with Autodistill.

Quickstart

from autodistill_seggpt import SegGPT, FewShotOntology

# supervision_dataset is a supervision DetectionDataset with masks;
# see "How to load data from Roboflow" below for how to build one
base_model = SegGPT(
    ontology=FewShotOntology(supervision_dataset)
)

base_model.label("./unlabelled-photos", extension=".jpg")

How to load data from Roboflow

Labeling and importing images is easy!

You can use Roboflow Annotate to label a few images (1-3 should work fine). For your Project Type, make sure to pick Instance Segmentation, as you will be labeling with polygons.

Once you have labeled your images, press Generate > Generate New Version. You can use all the default options; no augmentations are necessary.

Once your dataset version is generated, you can press Export > Continue.

Then you will get some download code to copy. It should look something like this:

!pip install roboflow

from roboflow import Roboflow
rf = Roboflow(api_key="ABCDEFG")
project = rf.workspace("lorem-ipsum").project("dolor-sit-amet")
dataset = project.version(1).download("yolov8")

Note: if you are not using a notebook environment, you should remove !pip install roboflow from your code, and run pip install roboflow in your terminal instead.

To import your dataset into Autodistill, run the following:

import supervision as sv

# force_masks=True loads the polygon annotations as segmentation
# masks rather than just bounding boxes, which SegGPT needs
supervision_dataset = sv.DetectionDataset.from_yolo(
    images_directory_path=f"{dataset.location}/train/images",
    annotations_directory_path=f"{dataset.location}/train/labels",
    data_yaml_path=f"{dataset.location}/data.yaml",
    force_masks=True
)
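For reference, the yolov8 export stores each object as one line of normalized polygon coordinates in a text file next to the image. The sketch below (a hypothetical helper, not part of supervision or Autodistill) shows how one such line maps back to pixel coordinates:

```python
# A YOLO segmentation label line has the form
# "<class_id> x1 y1 x2 y2 ...", with coordinates normalized to [0, 1].

def parse_yolo_seg_line(line: str, image_width: int, image_height: int):
    """Return (class_id, [(x_px, y_px), ...]) for one label line."""
    fields = line.split()
    class_id = int(fields[0])
    coords = [float(v) for v in fields[1:]]
    # Take (x, y) pairs and scale from normalized units to pixels.
    points = [
        (coords[i] * image_width, coords[i + 1] * image_height)
        for i in range(0, len(coords), 2)
    ]
    return class_id, points

# A square covering the central half of a 640x480 image.
class_id, polygon = parse_yolo_seg_line(
    "0 0.25 0.25 0.75 0.25 0.75 0.75 0.25 0.75", 640, 480
)
print(class_id, polygon)
# → 0 [(160.0, 120.0), (480.0, 120.0), (480.0, 360.0), (160.0, 360.0)]
```

With force_masks=True, supervision rasterizes these polygons into the boolean masks that FewShotOntology uses as segmentation context.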

License

The code in this repository is licensed under an MIT license.

See the SegGPT repository for more information on the SegGPT license.

🏆 Contributing

We love your input! Please see the core Autodistill contributing guide to get started. Thank you 🙏 to all our contributors!
