Promptzl - LLMs as Classifiers
Pr🥨mptzl
Turn state-of-the-art LLMs into zero+-shot PyTorch classifiers in just a few lines of code.
Promptzl offers:
- 🤖 Zero+-shot classification with LLMs
- 🤗 Turning causal and masked LMs into classifiers without any training
- 📦 Batch processing on your device for efficiency
- 🚀 Speed-up over calling an online API
- 🔎 Transparency and accessibility by using the model locally
- 📈 Distribution over labels
- ✂️ No need to extract predictions from the answer
For more information, check out the official documentation.
Installation
```shell
pip install -U promptzl
```
Getting Started
In just a few lines of code, you can turn an LLM of your choice into an old-school classifier with all its desirable properties:
Set up the dataset:
```python
from datasets import Dataset

dataset = Dataset.from_dict(
    {
        'text': [
            "The food was absolutely wonderful, from preparation to presentation, very pleasing.",
            "The service was a bit slow, but the food made up for it. Highly recommend the pasta!",
            "The restaurant was too noisy and the food was mediocre at best. Not worth the price.",
        ],
        'label': [1, 1, 0]
    }
)
```
Define a prompt that guides the language model to the correct predictions:
```python
from promptzl import FnVbzPair, Vbz

prompt = FnVbzPair(
    lambda e: f"""Restaurant review classification into categories 'positive' or 'negative'.
'Best pretzls in town!'='positive'
'Rude staff, horrible food.'='negative'
'{e['text']}'=""",
    Vbz({0: ["negative"], 1: ["positive"]}))
```
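Because the first element of the pair is an ordinary Python function, you can inspect the exact string handed to the model by calling it on a record yourself. A minimal sketch in plain Python (no promptzl import needed; `fmt` and `example` are names introduced here for illustration):

```python
# The same formatting function as above, applied to one record by hand
# so you can see the few-shot prompt the model will actually receive.
fmt = lambda e: f"""Restaurant review classification into categories 'positive' or 'negative'.
'Best pretzls in town!'='positive'
'Rude staff, horrible food.'='negative'
'{e['text']}'="""

example = {'text': "The food was absolutely wonderful, from preparation to presentation, very pleasing."}
print(fmt(example))
```

The rendered prompt ends with `'<review>'=`, so the model's most likely next token should be one of the verbalizer words.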
Initialize a model:
```python
from promptzl import CausalLM4Classification

model = CausalLM4Classification(
    'HuggingFaceTB/SmolLM2-1.7B',
    prompt=prompt)
```
Classify the data:
```python
from sklearn.metrics import accuracy_score

output = model.classify(dataset, show_progress_bar=True, batch_size=1)
accuracy_score(dataset['label'], output.predictions)
# 1.0
```
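The "distribution over labels" listed among the features comes from comparing the model's next-token scores for the verbalizer words and normalizing them with a softmax. A minimal sketch of that idea using made-up logits instead of a real model (`label_distribution` and the numbers below are illustrative, not part of the promptzl API):

```python
import math

def label_distribution(token_logits, verbalizer):
    """Softmax over the logits of the verbalizer tokens only.

    token_logits: mapping token -> raw logit (made-up numbers here)
    verbalizer:   mapping label -> token string
    """
    scores = {label: token_logits[tok] for label, tok in verbalizer.items()}
    z = max(scores.values())  # subtract the max for numerical stability
    exps = {label: math.exp(s - z) for label, s in scores.items()}
    total = sum(exps.values())
    return {label: e / total for label, e in exps.items()}

# Hypothetical logits for the two verbalizer tokens at the trailing '=' position.
logits = {"positive": 2.0, "negative": 0.5}
dist = label_distribution(logits, {1: "positive", 0: "negative"})
# dist is a proper probability distribution over the labels; here label 1 wins.
```

Restricting the softmax to the verbalizer tokens is what lets the classifier return calibrated-looking probabilities without ever parsing generated text.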
For more detailed tutorials, check out the documentation!
File details
Details for the file promptzl-1.0.0.tar.gz.
File metadata
- Download URL: promptzl-1.0.0.tar.gz
- Upload date:
- Size: 25.4 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.0.0 CPython/3.12.7
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `88687944e7a2f7e0cebae61d31e918c7d4d59d2d877d8382a275fa030951cf9f` |
| MD5 | `863e895ff3d035e9117422d18ee4ae24` |
| BLAKE2b-256 | `a4e70421bb579b961f189316658594b670412479220e6ddc30a363a9511b231f` |
File details
Details for the file promptzl-1.0.0-py3-none-any.whl.
File metadata
- Download URL: promptzl-1.0.0-py3-none-any.whl
- Upload date:
- Size: 20.5 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.0.0 CPython/3.12.7
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `38fef5923093501dc852f7238e90d9f2e5c18aece27eea2ada183ab662498450` |
| MD5 | `220c006d9052200126fef7f2adc7a4ed` |
| BLAKE2b-256 | `15a0379ae2619a2afabaee69404f2e751cfe62f63512217030f781c7c9f2e493` |