Transformer-based zero and few-shot classification in scikit-learn pipelines

Project description

stormtrooper


Transformer-based zero/few shot learning components for scikit-learn pipelines.

Documentation

New in version 0.3.0 🌟

  • SetFit is now part of the library and can be used in scikit-learn workflows.

Example

pip install stormtrooper

class_labels = ["atheism/christianity", "astronomy/space"]
example_texts = [
    "God came down to earth to save us.",
    "A new nebula was recently discovered in the proximity of the Oort cloud."
]

Zero-shot learning

For zero-shot learning, you can use dedicated zero-shot models:

from stormtrooper import ZeroShotClassifier
classifier = ZeroShotClassifier().fit(None, class_labels)
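
The fitted classifier then behaves like any other scikit-learn estimator, so prediction is a plain predict() call. A minimal usage sketch, reusing the example texts from above:

predictions = classifier.predict(example_texts)
# Each prediction should be one of the class labels passed to fit().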

Generative models (GPT, Llama):

from stormtrooper import GenerativeZeroShotClassifier
# You can hand-craft prompts if it suits you better, but
# a default prompt is already available
prompt = """
### System:
You are a literary expert tasked with labeling texts according to
their content.
Please follow the user's instructions as precisely as you can.
### User:
Your task will be to classify a text document into one
of the following classes: {classes}.
Please respond with a single label that you think fits
the document best.
Classify the following piece of text:
'{X}'
### Assistant:
"""
classifier = GenerativeZeroShotClassifier(prompt=prompt).fit(None, class_labels)
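
Prediction then works the same way as with the other components; the {classes} and {X} placeholders in the template are presumably filled with the class labels and each document before generation. A minimal usage sketch:

predictions = classifier.predict(example_texts)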

Text2Text models (T5): If you are running low on resources, I would personally recommend T5.

from stormtrooper import Text2TextZeroShotClassifier
# You can define a custom prompt, but a default one is available
prompt = "..."
classifier = Text2TextZeroShotClassifier(prompt=prompt).fit(None, class_labels)
predictions = classifier.predict(example_texts)

assert list(predictions) == ["atheism/christianity", "astronomy/space"]
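
Since these components follow the scikit-learn estimator API, they should also drop into a regular Pipeline. A minimal sketch, reusing class_labels and example_texts from above (here the pipeline contains only the classifier itself; any preceding steps would need to accept and return raw text):

from sklearn.pipeline import Pipeline
from stormtrooper import Text2TextZeroShotClassifier

pipeline = Pipeline([("classifier", Text2TextZeroShotClassifier())])
pipeline.fit(None, class_labels)
pipeline_predictions = pipeline.predict(example_texts)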

Few-Shot Learning

For few-shot tasks you can only use Generative, Text2Text (a.k.a. promptable), or SetFit models.

from stormtrooper import GenerativeFewShotClassifier, Text2TextFewShotClassifier, SetFitFewShotClassifier

classifier = SetFitFewShotClassifier().fit(example_texts, class_labels)
predictions = classifier.predict(["Calvinists believe in predestination."])

assert list(predictions) == ["atheism/christianity"]
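
The promptable few-shot classifiers follow the same fit/predict pattern, with your labelled examples passed to fit. A minimal sketch with Text2TextFewShotClassifier, reusing the example texts and labels from above:

classifier = Text2TextFewShotClassifier().fit(example_texts, class_labels)
predictions = classifier.predict(["The rings of Saturn are made mostly of ice."])
# Should come back as "astronomy/space" given the examples above.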

Fuzzy Matching

By default, generative and text2text models fuzzy match their raw output to the closest class label; you can disable this behavior by specifying fuzzy_match=False.

If you want to speed up fuzzy matching, you should install python-Levenshtein.
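
For example, to get back the raw generated output instead of the closest class label (a minimal sketch, assuming the flag is passed at initialization):

from stormtrooper import GenerativeZeroShotClassifier

classifier = GenerativeZeroShotClassifier(fuzzy_match=False).fit(None, class_labels)
# Predictions may now contain free-form text that is not an exact class label.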

Inference on GPU

From version 0.2.2 onwards, you can run models on GPU by specifying the device when initializing a model:

classifier = Text2TextZeroShotClassifier(device="cuda:0")
