Finetuners: Reduce cognitive load when finetuning transformers 🥴
Finetuners is a thin wrapper around Hugging Face transformers that cuts a finetuning run down to three steps: load your data with FinetunersDataset, collect every setting in a single FinetunerArguments object, and call finetune() on a task-specific class such as FinetunerForTextClassification. No hand-written training loop, no scattered configuration.
Installation
pip install finetuners
Example
import pathlib

from finetuners import (
    FinetunerArguments,
    FinetunerForTextClassification,
    FinetunersDataset,
)

# load the dataset from a local directory
dataset = FinetunersDataset.from_path(
    pathlib.Path(__file__).parents[1].joinpath("datasets", "angry-tweets")
)

# define the model and training arguments in one place
args = FinetunerArguments(
    model_name="awesome_model",
    pretrained_model_name_or_path="Maltehb/danish-bert-botxo",
    training_args={
        "output_dir": "./runs/",
        "learning_rate": 5e-5,
    },
)

# init the task-specific finetuner
finetuner = FinetunerForTextClassification(
    dataset=dataset,
    args=args,
)

# run finetuning
finetuner.finetune()
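For comparison, below is a minimal sketch of a roughly equivalent setup written directly against Hugging Face transformers and datasets, i.e. what the wrapper is presumably saving you from. Only the checkpoint name and the training arguments come from the example above; the CSV file layout, the "text"/"label" column names, and num_labels=3 are illustrative assumptions, not part of the finetuners API.

from datasets import load_dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

checkpoint = "Maltehb/danish-bert-botxo"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
# num_labels=3 is an assumption (e.g. positive/neutral/negative sentiment)
model = AutoModelForSequenceClassification.from_pretrained(checkpoint, num_labels=3)

# load and tokenize the data (assumed: a CSV with "text" and "label" columns)
dataset = load_dataset("csv", data_files={"train": "datasets/angry-tweets/train.csv"})
dataset = dataset.map(
    lambda batch: tokenizer(batch["text"], truncation=True, padding="max_length"),
    batched=True,
)

training_args = TrainingArguments(output_dir="./runs/", learning_rate=5e-5)
trainer = Trainer(model=model, args=training_args, train_dataset=dataset["train"])
trainer.train()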