
SegGPT for use with Autodistill

Project description

Autodistill SegGPT Module

This repository contains the code supporting the SegGPT base model for use with Autodistill.

SegGPT is a transformer-based, few-shot semantic segmentation model developed by BAAI Vision.

This model performs well on domain-specific segmentation tasks when given a few labeled images from which it can learn the appearance of the objects you want to identify.

Read the full Autodistill documentation.

Read the SegGPT Autodistill documentation.

Installation

To use SegGPT with Autodistill, you need to install the following dependency:

pip3 install autodistill-seggpt

About SegGPT

SegGPT performs "in-context" segmentation. This means it requires a handful of pre-labeled "context" images from which to learn what to segment.

You will need some labeled images to use SegGPT. Don't have any labeled images? Check out Roboflow Annotate, a feature-rich annotation tool from which you can export data for use with Autodistill.

Quickstart

from autodistill_seggpt import SegGPT, FewShotOntology

# supervision_dataset is a supervision DetectionDataset with masks
# (see "How to load data from Roboflow" below)
base_model = SegGPT(
    ontology=FewShotOntology(supervision_dataset)
)

# auto-label every .jpg image in the folder
base_model.label("./unlabelled-photos", extension=".jpg")
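Once your images are labeled, you can distill the results into a smaller target model, as with other Autodistill base models. The snippet below is a minimal sketch that assumes the autodistill-yolov8 package is installed and that labels were written to the default ./unlabelled-photos_labeled folder; adjust the checkpoint, paths, and epochs for your project.

from autodistill_yolov8 import YOLOv8

# train a YOLOv8 segmentation model on the auto-labeled dataset
target_model = YOLOv8("yolov8n-seg.pt")
target_model.train("./unlabelled-photos_labeled/data.yaml", epochs=200)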

How to load data from Roboflow

Labeling and importing images is easy!

You can use Roboflow Annotate to label a few images (1-3 should work fine). For your Project Type, make sure to pick Instance Segmentation, since you will be labeling with polygons.

Once you have labeled your images, press Generate > Generate New Version. The default options are fine; no augmentations are necessary.

Once your dataset version is generated, you can press Export > Continue.

Then you will get some download code to copy. It should look something like this:

!pip install roboflow

from roboflow import Roboflow
rf = Roboflow(api_key="ABCDEFG")
project = rf.workspace("lorem-ipsum").project("dolor-sit-amet")
dataset = project.version(1).download("yolov8")

Note: if you are not using a notebook environment, you should remove !pip install roboflow from your code, and run pip install roboflow in your terminal instead.

To import your dataset into Autodistill, run the following:

import supervision as sv

# force_masks=True loads the polygon annotations as masks,
# which SegGPT needs as its few-shot context
supervision_dataset = sv.DetectionDataset.from_yolo(
    images_directory_path=f"{dataset.location}/train/images",
    annotations_directory_path=f"{dataset.location}/train/labels",
    data_yaml_path=f"{dataset.location}/data.yaml",
    force_masks=True
)
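Before building the ontology, you can quickly check that the dataset loaded with masks. This is an optional sanity check using standard supervision DetectionDataset attributes:

print(supervision_dataset.classes)  # class names parsed from data.yaml
print(len(supervision_dataset))     # number of labeled reference images

You can then pass supervision_dataset to FewShotOntology as shown in the Quickstart above.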

License

The code in this repository is licensed under an MIT license.

See the SegGPT repository for more information on the SegGPT license.

🏆 Contributing

We love your input! Please see the core Autodistill contributing guide to get started. Thank you 🙏 to all our contributors!

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

autodistill-seggpt-0.1.1.tar.gz (18.7 kB)

Uploaded Source

Built Distribution

autodistill_seggpt-0.1.1-py3-none-any.whl (23.0 kB)

Uploaded Python 3

File details

Details for the file autodistill-seggpt-0.1.1.tar.gz.

File metadata

  • Download URL: autodistill-seggpt-0.1.1.tar.gz
  • Upload date:
  • Size: 18.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.7.12

File hashes

Hashes for autodistill-seggpt-0.1.1.tar.gz:

  • SHA256: 29f587774487dea50d88edc1d84d99ce0f959381e7b6ef794e22f5c601b8e4c0
  • MD5: 6ce242846a1dff18160aaa62835bbbac
  • BLAKE2b-256: b07ec890f01d6dee12418653b101cf93a8a1c957569fac6915aed3f221a694f1

See more details on using hashes here.

File details

Details for the file autodistill_seggpt-0.1.1-py3-none-any.whl.

File metadata

File hashes

Hashes for autodistill_seggpt-0.1.1-py3-none-any.whl:

  • SHA256: 30b1d66ad36410c52c83a007ee6c950039111a8d1735890a1875974b588d14f2
  • MD5: 1200841c613520db1adfc15d5ae6d1be
  • BLAKE2b-256: 156ba1e5bfdf3e30d387b1b46e4b216d222b3b9ecc05c3ea8d54b8ea9ca93dfe

See more details on using hashes here.
