
Use Florence 2 to auto-label data for use in training fine-tuned object detection models.

Project description

Autodistill Florence 2 Module

This repository contains the code supporting the Florence 2 base model for use with Autodistill.

Florence 2, introduced in the paper Florence-2: Advancing a Unified Representation for a Variety of Vision Tasks, is a multimodal vision model.

You can use Florence 2 to generate object detection annotations for use in training smaller object detection models with Autodistill.

Read the full Autodistill documentation.

Read the Florence 2 Autodistill documentation.

Installation

To use Florence 2 with Autodistill, you need to install the following dependency:

pip3 install autodistill-florence-2

Quickstart (Inference from Base Weights)

from autodistill_florence_2 import Florence2
from autodistill.detection import CaptionOntology
from PIL import Image
import supervision as sv

# define an ontology to map class names to our Florence 2 prompt
# the ontology dictionary has the format {caption: class}
# where caption is the prompt sent to the base model, and class is the label that will
# be saved for that caption in the generated annotations
# then, load the model
base_model = Florence2(
    ontology=CaptionOntology(
        {
            "person": "person",
            "a forklift": "forklift"
        }
    )
)

image = Image.open("image.jpeg")
detections = base_model.predict("image.jpeg")

bounding_box_annotator = sv.BoundingBoxAnnotator()
annotated_frame = bounding_box_annotator.annotate(
    scene=image.copy(),
    detections=detections
)
sv.plot_image(image=annotated_frame, size=(16, 16))

# label a dataset
base_model.label("./context_images", extension=".jpeg")
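The label() call writes an auto-labeled dataset to disk (by default in YOLO format with train and valid splits). A minimal stdlib sketch for sanity-checking the output; the directory layout here is an assumption based on Autodistill's typical output, so adjust the paths if your version writes elsewhere:

```python
from pathlib import Path

def count_labeled_files(dataset_dir: str) -> dict:
    """Count images and YOLO-format .txt label files per split.

    Assumes a train/ and valid/ layout with images/ and labels/
    subfolders (the layout Autodistill typically writes); adjust
    the paths if your output directory differs.
    """
    counts = {}
    for split in ("train", "valid"):
        images_dir = Path(dataset_dir) / split / "images"
        labels_dir = Path(dataset_dir) / split / "labels"
        counts[split] = {
            "images": len(list(images_dir.glob("*"))) if images_dir.exists() else 0,
            "labels": len(list(labels_dir.glob("*.txt"))) if labels_dir.exists() else 0,
        }
    return counts
```

A mismatch between image and label counts usually means some images produced no detections for the given ontology.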

Quickstart (Fine-Tune)

from autodistill_florence_2 import Florence2Trainer

# fine-tune on a labeled dataset, such as one generated by base_model.label()
model = Florence2Trainer()
model.train("./dataset", epochs=10)

License

This project is licensed under an MIT license. See the Florence 2 license for more information about the Florence 2 model license.

🏆 Contributing

We love your input! Please see the core Autodistill contributing guide to get started. Thank you 🙏 to all our contributors!



Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

autodistill_florence_2-0.1.1.tar.gz (5.8 kB)

Uploaded Source

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

autodistill_florence_2-0.1.1-py3-none-any.whl (5.9 kB)

Uploaded Python 3

File details

Details for the file autodistill_florence_2-0.1.1.tar.gz.

File metadata

  • Download URL: autodistill_florence_2-0.1.1.tar.gz
  • Upload date:
  • Size: 5.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.11.7

File hashes

Hashes for autodistill_florence_2-0.1.1.tar.gz
Algorithm Hash digest
SHA256 16574ed62f7fcc30df3284cd7324e2aef9cc994500154105daf6ac78ea8da209
MD5 676d5f1cd7e7e4cf8193eebd50b1163d
BLAKE2b-256 f4284324103897a3f7043d1c896cc59025b917a5bfef05e587223e99eaa102f5

See more details on using hashes here.
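To verify a downloaded file against the SHA256 digest published above, a short sketch using only the Python standard library (the filename is the one listed for this release):

```python
import hashlib

def sha256_of(path: str, chunk_size: int = 1 << 16) -> str:
    """Return the hex SHA256 digest of a file, read in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

# compare against the published digest, e.g.:
# sha256_of("autodistill_florence_2-0.1.1.tar.gz") == "16574ed6..."
```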

File details

Details for the file autodistill_florence_2-0.1.1-py3-none-any.whl.

File metadata

File hashes

Hashes for autodistill_florence_2-0.1.1-py3-none-any.whl
Algorithm Hash digest
SHA256 c4a559748a94d652c230950ab7f5e2e3b952f49b1dade38fd841e73e088053bc
MD5 d1acd1e0d366d6292ad14d427a6c7af9
BLAKE2b-256 186b05581c71620306e098ac5e27d3420741a79e55dddcc74d44c399c06c388b

See more details on using hashes here.
