
DistilBERT model for use with Autodistill

Project description

Autodistill DistilBERT Module

This repository contains the code supporting the DistilBERT target model for use with Autodistill.

DistilBERT is a language model architecture commonly used for training sentence classification models. You can use Autodistill to train a DistilBERT model that classifies text.

Installation

To use the DistilBERT target model, you will need to install the following package:

pip3 install autodistill-distilbert-text

Quickstart

The DistilBERT module takes in .jsonl files and trains a text classification model.

Each record in the JSONL file should have a text entry that contains the text to be classified and a label entry that contains the ground-truth label for that text. This format is returned by Autodistill base text classification models such as the GPTClassifier.

Here is an example record used to train a research paper subject classifier:

{"title": "CC-GPX: Extracting High-Quality Annotated Geospatial Data from Common Crawl", "text": "arXiv:2405.11039v1 Announce Type: new \nAbstract: The Common Crawl (CC) corpus....", "label": "natural language processing"}
from autodistill_distilbert import DistilBERT

target_model = DistilBERT()

# train a model
target_model.train("./data.jsonl", epochs=200)

# run inference on the new model
pred = target_model.predict("Geospatial data.", conf=0.01)

print(pred)
# geospatial
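
You can reuse predict to classify a batch of documents with the model trained above; a short sketch (the texts are illustrative):

texts = [
    "A study of transformer architectures for machine translation.",
    "Monitoring deforestation with satellite imagery.",
]

for text in texts:
    # print each text alongside its predicted label
    print(text, "->", target_model.predict(text, conf=0.01))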

License

This project is licensed under the MIT License.

🏆 Contributing

We love your input! Please see the core Autodistill contributing guide to get started. Thank you 🙏 to all our contributors!

Download files

Download the file for your platform.

Source Distribution

autodistill_distilbert-0.1.0.tar.gz (5.9 kB, source)

Built Distribution

autodistill_distilbert-0.1.0-py3-none-any.whl (4.7 kB, Python 3)

File details

Details for the file autodistill_distilbert-0.1.0.tar.gz.

File hashes:

SHA256: 5e0da1b95cc011476a2097262d31dbba1901a0c17d715d3623561034a2dec032
MD5: c6a07fbad36fe54b00157c8754c5f43d
BLAKE2b-256: 03128653af1b45dfe792d7de758946c0a3140207b121b4792b255891576cf95d
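
To check a downloaded archive against the SHA256 digest above, here is a quick verification sketch using Python's standard hashlib (the path assumes the archive is in the current directory):

import hashlib

# compute the SHA256 digest of the downloaded archive
with open("autodistill_distilbert-0.1.0.tar.gz", "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()

# True if the file matches the published hash
print(digest == "5e0da1b95cc011476a2097262d31dbba1901a0c17d715d3623561034a2dec032")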

File details

Details for the file autodistill_distilbert-0.1.0-py3-none-any.whl.

File hashes:

SHA256: 53b6c9a28d8b6cc593c48f053bd20672ade129c16c2457f7d6f499f6cdf339d4
MD5: 75f4a830020126e51a76be89add8284b
BLAKE2b-256: ffc162fa884b07542e872d7f437a63b6cd186a64815ec744e9741b665080e7f0
