
Autodistill GPT Module

This repository contains the code supporting the GPT (text) base model for use with Autodistill.

You can use Autodistill GPT to classify text with OpenAI's GPT models and use the resulting labels to train smaller, fine-tuned text classification models. You can also use Autodistill GPT with LLaMAfile text generation models.

Read the full Autodistill documentation.

Installation

To use GPT or LLaMAfile models with Autodistill, you need to install the following dependency:

pip3 install autodistill-gpt-text

Quickstart (LLaMAfile)

from autodistill_gpt_text import GPTClassifier
from autodistill.detection import CaptionOntology

# define an ontology to map class names to our GPT prompt
# the ontology dictionary has the format {caption: class}
# where caption is the prompt sent to the base model, and class is the label that will
# be saved for that caption in the generated annotations
# then, load the model
base_model = GPTClassifier(
    ontology=CaptionOntology(
        {
            "computer vision": "computer vision",
            "natural language processing": "nlp"
        }
    ),
    base_url="http://localhost:8080/v1",  # your llamafile server
    model_id="LLaMA_CPP"
)

# label a single text
result = base_model.predict("This is a blog post about computer vision.")

# label a JSONL file of texts
base_model.label("data.jsonl", output="output.jsonl")

Quickstart (GPT)

from autodistill_gpt_text import GPTClassifier
from autodistill.detection import CaptionOntology

# define an ontology to map class names to our GPT prompt
# the ontology dictionary has the format {caption: class}
# where caption is the prompt sent to the base model, and class is the label that will
# be saved for that caption in the generated annotations
# then, load the model
base_model = GPTClassifier(
    ontology=CaptionOntology(
        {
            "computer vision": "computer vision",
            "natural language processing": "nlp"
        }
    )
)

# label a single text
result = base_model.predict("This is a blog post about computer vision.")

# label a JSONL file of texts
base_model.label("data.jsonl", output="output.jsonl")

The output JSONL file will contain all the data from your original file, with a new classification key in each entry that contains the predicted label for that entry.
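
For example, you could read the labeled output back in with a short script like the one below. This is a minimal sketch: it assumes each entry stores its text under a content key, which is a hypothetical field name chosen for illustration, and its predicted label under the classification key described above.

import json

# read the labeled output and print each entry's text alongside its predicted label
# NOTE: "content" is an illustrative field name; use whatever text key your data.jsonl contains
with open("output.jsonl") as f:
    for line in f:
        entry = json.loads(line)
        print(entry["content"], "->", entry["classification"])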

License

This project is licensed under an MIT license.

🏆 Contributing

We love your input! Please see the core Autodistill contributing guide to get started. Thank you 🙏 to all our contributors!
