DistilBERT model for use with Autodistill
Project description
Autodistill DistilBERT Module
This repository contains the code supporting the DistilBERT target model for use with Autodistill.
DistilBERT is a language model architecture commonly used to train sentence classification models. You can use Autodistill to train a DistilBERT model that classifies text.
Installation
To use the DistilBERT target model, you will need to install the following dependency:
pip3 install autodistill-distilbert-text
Quickstart
The DistilBERT module takes in .jsonl files and trains a text classification model. Each record in the JSONL file should have an entry containing the text to be classified and an entry containing the ground truth label for that text (the example below uses the content and classification fields for these). This format is returned by Autodistill base text classification models like the GPTClassifier.
Here is an example entry of a record used to train a research paper subject classifier:
{"title": "CC-GPX: Extracting High-Quality Annotated Geospatial Data from Common Crawl", "content": "arXiv:2405.11039v1 Announce Type: new \nAbstract: The Common Crawl (CC) corpus....", "classification": "natural language processing"}
from autodistill_distilbert import DistilBERT
target_model = DistilBERT()
# train a model
target_model.train("./data.jsonl", epochs=200)
# run inference on the new model
pred = target_model.predict("Geospatial data.", conf=0.01)
print(pred)
# geospatial
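To classify many documents at once, a minimal batch-inference sketch follows, assuming the target_model trained above; the unlabeled.jsonl file name and its content field are illustrative, not part of the module's API.

import json

# Classify every record in a JSONL file with the trained model.
with open("unlabeled.jsonl") as f:
    for line in f:
        record = json.loads(line)
        pred = target_model.predict(record["content"], conf=0.01)
        print(pred)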
License
This project is licensed under an MIT license.
🏆 Contributing
We love your input! Please see the core Autodistill contributing guide to get started. Thank you 🙏 to all our contributors!
Project details
Release history
Download files
Download the file for your platform.
Source Distribution: autodistill_distilbert-0.1.0.tar.gz (5.9 kB)
Built Distribution: autodistill_distilbert-0.1.0-py3-none-any.whl (4.7 kB)
File details
Details for the file autodistill_distilbert-0.1.0.tar.gz.
File metadata
- Download URL: autodistill_distilbert-0.1.0.tar.gz
- Upload date:
- Size: 5.9 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.2 CPython/3.11.7
File hashes
Algorithm | Hash digest
---|---
SHA256 | 5e0da1b95cc011476a2097262d31dbba1901a0c17d715d3623561034a2dec032
MD5 | c6a07fbad36fe54b00157c8754c5f43d
BLAKE2b-256 | 03128653af1b45dfe792d7de758946c0a3140207b121b4792b255891576cf95d
File details
Details for the file autodistill_distilbert-0.1.0-py3-none-any.whl.
File metadata
- Download URL: autodistill_distilbert-0.1.0-py3-none-any.whl
- Upload date:
- Size: 4.7 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.2 CPython/3.11.7
File hashes
Algorithm | Hash digest
---|---
SHA256 | 53b6c9a28d8b6cc593c48f053bd20672ade129c16c2457f7d6f499f6cdf339d4
MD5 | 75f4a830020126e51a76be89add8284b
BLAKE2b-256 | ffc162fa884b07542e872d7f437a63b6cd186a64815ec744e9741b665080e7f0
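To confirm a download matches the digests above, here is a short verification sketch using Python's standard hashlib; the file name is the source distribution listed earlier, and the expected value is its published SHA256.

import hashlib

def sha256_of(path: str) -> str:
    """Compute the SHA256 digest of a file, reading in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

expected = "5e0da1b95cc011476a2097262d31dbba1901a0c17d715d3623561034a2dec032"
actual = sha256_of("autodistill_distilbert-0.1.0.tar.gz")
print("OK" if actual == expected else "MISMATCH")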