SegGPT for use with Autodistill

Autodistill SegGPT Module

This repository contains the code supporting the SegGPT base model for use with Autodistill.

SegGPT is a transformer-based, few-shot semantic segmentation model developed by BAAI Vision.

This model performs well on specialized segmentation tasks when given a few labeled images from which it can learn the objects you want to identify.

Read the full Autodistill documentation.

Read the SegGPT Autodistill documentation.

Installation

To use SegGPT with Autodistill, you need to install the following dependency:

pip3 install autodistill-seggpt

About SegGPT

SegGPT performs "in-context" segmentation: it requires a handful of pre-labeled "context" images from which it learns what to segment.

You will need some labeled images to use SegGPT. Don't have any labeled images? Check out Roboflow Annotate, a feature-rich annotation tool from which you can export data for use with Autodistill.

Quickstart

from autodistill_seggpt import SegGPT, FewShotOntology

# supervision_dataset is a supervision DetectionDataset with mask annotations;
# see "How to load data from Roboflow" below for how to build one.
base_model = SegGPT(
    ontology=FewShotOntology(supervision_dataset)
)

# Auto-label a folder of images using the few-shot ontology
base_model.label("./unlabelled-photos", extension=".jpg")
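
If you want to sanity-check predictions on a single image before labeling a whole folder, Autodistill base models also expose a predict() method that returns supervision Detections. The snippet below is a minimal sketch of that flow; the ./example.jpg path is a placeholder, and it assumes you have opencv-python installed for reading and writing images.

import cv2
import supervision as sv

# Run SegGPT on a single image (path is a placeholder for your own image)
image = cv2.imread("./example.jpg")
detections = base_model.predict("./example.jpg")

# Overlay the predicted masks on the image and save the result
mask_annotator = sv.MaskAnnotator()
annotated_image = mask_annotator.annotate(scene=image.copy(), detections=detections)
cv2.imwrite("./annotated-example.jpg", annotated_image)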

How to load data from Roboflow

Labeling and importing images is easy!

You can use Roboflow Annotate to label a few images (1-3 should work fine). For your Project Type, make sure to pick Instance Segmentation, as you will be labeling with polygons.

Once you have labeled your images, press Generate > Generate New Version. You can use all the default options; no augmentations are necessary.

Once your dataset version is generated, you can press Export > Continue.

You will then be given a code snippet for downloading your dataset. It should look something like this:

!pip install roboflow

from roboflow import Roboflow

# The API key, workspace, and project names below are placeholders;
# your export snippet will contain your own values.
rf = Roboflow(api_key="ABCDEFG")
project = rf.workspace("lorem-ipsum").project("dolor-sit-amet")
dataset = project.version(1).download("yolov8")

Note: if you are not using a notebook environment, you should remove !pip install roboflow from your code, and run pip install roboflow in your terminal instead.

To import your dataset into Autodistill, run the following:

import supervision as sv

# Load the YOLOv8-format export as a supervision DetectionDataset.
# force_masks=True converts the polygon annotations into segmentation masks,
# which SegGPT needs for its few-shot context images.
supervision_dataset = sv.DetectionDataset.from_yolo(
    images_directory_path=f"{dataset.location}/train/images",
    annotations_directory_path=f"{dataset.location}/train/labels",
    data_yaml_path=f"{dataset.location}/data.yaml",
    force_masks=True
)
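
The supervision_dataset created above is exactly what the Quickstart passes into FewShotOntology. Putting the pieces together, the snippet below is a minimal sketch of the full flow, with ./unlabelled-photos as a placeholder for your own folder of images to label.

from autodistill_seggpt import SegGPT, FewShotOntology

# Quick sanity check: the dataset should contain your labeled context images and classes
print(len(supervision_dataset), "labeled images")
print("classes:", supervision_dataset.classes)

# Build the few-shot ontology from the labeled images, then label a folder
base_model = SegGPT(ontology=FewShotOntology(supervision_dataset))
base_model.label("./unlabelled-photos", extension=".jpg")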

License

The code in this repository is licensed under an MIT license.

See the SegGPT repository for more information on the SegGPT license.

🏆 Contributing

We love your input! Please see the core Autodistill contributing guide to get started. Thank you 🙏 to all our contributors!

