
Finetuners: Reduce cognitive load when finetuning transformers 🥴

Finetuners wraps the steps needed to finetune a pretrained transformer into a small, declarative API: load a dataset, describe the run in a single arguments object, and call finetune(). The goal is to reduce the cognitive load of setting up yet another training loop.

Installation

pip install finetuners
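
To pin the exact release described here (0.0.1 at the time of writing), specify the version explicitly:

pip install finetuners==0.0.1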

Example

import pathlib

from finetuners import (
    FinetunerArguments,
    FinetunerForTextClassification,
    FinetunersDataset,
)

# load dataset
dataset = FinetunersDataset.from_path(
    pathlib.Path(__file__).parents[1].joinpath("datasets", "angry-tweets")
)

# define arguments
args = FinetunerArguments(
    model_name="awesome_model",
    pretrained_model_name_or_path="Maltehb/danish-bert-botxo",
    training_args={
        "output_dir": "./runs/",
        "learning_rate": 5e-5,
    },
)

# init finetuner
finetuner = FinetunerForTextClassification(
    dataset=dataset,
    args=args,
)

# run finetuning
finetuner.finetune()
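
If the finetuner saves a standard Hugging Face checkpoint under the output_dir given in training_args (an assumption; the save location is not shown above), the resulting model can be loaded back with plain transformers for a quick smoke test. The checkpoint path below is hypothetical:

from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    pipeline,
)

# hypothetical path: assumes the run above left a checkpoint under ./runs/
checkpoint = "./runs/awesome_model"

tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSequenceClassification.from_pretrained(checkpoint)

# sentiment classification with the finetuned Danish BERT
classifier = pipeline("text-classification", model=model, tokenizer=tokenizer)
print(classifier("Sikke en fantastisk dag!"))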
