A Large Language Model fine-tuning package. It fine-tunes an LLM with a single line of code by taking care of all the boilerplate in the backend.
Project description
🔥 One Line LLM Tuner 🔥
Fine-tune any Large Language Model (LLM) available on Hugging Face in a single line of code. Created by Suhas Bhairav.
Overview
one-line-llm-tuner is a Python package designed to simplify the process of fine-tuning large language models (LLMs) such as GPT-2, Llama-2, GPT-3, and more. With just one line of code, you can fine-tune a pre-trained model on your specific dataset. Think of it as a wrapper for the transformers library, much like keras is a wrapper for tensorflow.
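To give a sense of the boilerplate being wrapped, here is a rough sketch of a typical transformers fine-tuning loop for a causal LM. It is illustrative only and assumes a plain-text train.txt; it is not the package's actual internals.

# Illustrative sketch of the kind of transformers boilerplate that
# one-line-llm-tuner wraps (not the package's actual internals).
from datasets import load_dataset
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

model_name = "gpt2"
tokenizer = AutoTokenizer.from_pretrained(model_name)
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 has no pad token by default
model = AutoModelForCausalLM.from_pretrained(model_name)

# Load and tokenize a plain-text training file
dataset = load_dataset("text", data_files={"train": "train.txt"})
tokenized = dataset.map(
    lambda batch: tokenizer(batch["text"], truncation=True, padding="max_length"),
    batched=True,
    remove_columns=["text"],
)

training_args = TrainingArguments(
    output_dir="./results",
    num_train_epochs=2,
    per_device_train_batch_size=8,
    logging_steps=500,
    save_steps=500,
)

trainer = Trainer(
    model=model,
    args=training_args,
    train_dataset=tokenized["train"],
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()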
Features
- Simple: Fine-tune models with minimal code.
- Supports Popular LLMs: Works with models from the transformers library, including GPT, BERT, and more.
- Customizable: Advanced users can customize the fine-tuning process with additional parameters.
Installation
You can install one-line-llm-tuner using pip:
pip install one-line-llm-tuner
Usage
After installation, the package can be used as follows.
from one_line_llm_tuner.tuner import llm_tuner

fine_tune_obj = llm_tuner.FineTuneModel()  # uses the default settings (GPT-2)
fine_tune_obj.fine_tune_model(input_file_path="train.txt")  # fine-tune on your text file
fine_tune_obj.predict_text("Elon Musk founded SpaceX in ")  # generate a completion
If you want to modify the default values, such as the model used, the tokenizer settings, and more, use the following code.
from one_line_llm_tuner.tuner import llm_tuner
fine_tune_obj = llm_tuner.FineTuneModel(
    model_name="gpt2",
    test_size=0.3,
    training_dataset_filename="train_dataset.txt",
    testing_dataset_filename="test_dataset.txt",
    tokenizer_truncate=True,
    tokenizer_padding=True,
    output_dir="./results",
    num_train_epochs=2,
    logging_steps=500,
    save_steps=500,
    per_device_train_batch_size=128,
    per_device_eval_batch_size=128,
    max_output_length=100,
    num_return_sequences=1,
    skip_special_tokens=True,
)
fine_tune_obj.fine_tune_model(input_file_path="train.txt")
fine_tune_obj.predict_text("Elon Musk founded SpaceX in ")
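The training-related keyword arguments above mirror transformers TrainingArguments fields, while max_output_length, num_return_sequences, and skip_special_tokens look like generation and decoding settings. As a rough, assumed sketch (not the package's actual code, and with a hypothetical checkpoint path), predict_text is roughly equivalent to:

from transformers import AutoModelForCausalLM, AutoTokenizer

# Load a fine-tuned checkpoint; the path below is a hypothetical example.
tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("./results/checkpoint-500")

inputs = tokenizer("Elon Musk founded SpaceX in ", return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_length=100,                       # max_output_length
    num_return_sequences=1,               # num_return_sequences
    pad_token_id=tokenizer.eos_token_id,  # avoid a pad-token warning with GPT-2
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))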
Contributing
We welcome contributions! Please see the contributing guide for more details.
License
This project is licensed under the terms of the MIT license. See the LICENSE file for details.
Download files
Source Distribution: one_line_llm_tuner-0.0.15.tar.gz (6.0 kB)
Built Distribution: one_line_llm_tuner-0.0.15-py3-none-any.whl (6.1 kB)
File details
Details for the file one_line_llm_tuner-0.0.15.tar.gz.
File metadata
- Download URL: one_line_llm_tuner-0.0.15.tar.gz
- Upload date:
- Size: 6.0 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.1.1 CPython/3.12.3
File hashes
Algorithm | Hash digest
---|---
SHA256 | 282fd46c7ae195da16ed3aa54c980126251acd8c63f96c3ed536e96ce25c9a01
MD5 | c88f253dd77dc0460fe5d83509147ae9
BLAKE2b-256 | 5b4ac38cfb6cd5d1ebdf4b42ebe6a9a9c86b41565089f1cedc0cf9ea4238a60e
File details
Details for the file one_line_llm_tuner-0.0.15-py3-none-any.whl.
File metadata
- Download URL: one_line_llm_tuner-0.0.15-py3-none-any.whl
- Upload date:
- Size: 6.1 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.1.1 CPython/3.12.3
File hashes
Algorithm | Hash digest
---|---
SHA256 | d5899368edb9f069c6abbfd3269ae3f3a316662ad06a31b8ebb0dc4d2a3e1306
MD5 | 48c196edf16455ea2996ac385a109853
BLAKE2b-256 | 97e1a71d245cea9de8f7dc67a16a14c5731ef3d9031083888ded9858bfab5cb6