Prepare your dataset for fine-tuning LLMs
Dataset Preparation for Transformers Fine-tuning
The Dataset Prep Transformers package simplifies preparing datasets for training or fine-tuning large language models available in the Hugging Face Transformers library. Whether you're pulling a dataset from the Hugging Face Hub or bringing your own, this package streamlines data integration for a seamless training experience.
Features
- Easily integrate your dataset with Hugging Face Transformers models for training or fine-tuning.
- Specify the model repository ID and dataset from the Hugging Face library to automatically fetch and configure the data.
- Seamlessly incorporate your custom dataset by providing it as input to the package.
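The package's exact API is not documented here, so as a rough illustration of the kind of preparation it automates, the sketch below (all function and field names are hypothetical, not tydataprep's actual interface) converts raw prompt/response pairs into the single-text records that causal-LM fine-tuning pipelines typically expect:

```python
# Hypothetical sketch of custom-dataset preparation for causal-LM fine-tuning.
# Names here are illustrative assumptions, not tydataprep's real API.

def to_training_record(example, prompt_key="prompt", response_key="response",
                       eos_token="</s>"):
    """Join one prompt/response pair into a single training string."""
    text = f"{example[prompt_key]}\n{example[response_key]}{eos_token}"
    return {"text": text}

def prepare_dataset(raw_examples):
    """Transform every raw example into a tokenizer-ready record."""
    return [to_training_record(ex) for ex in raw_examples]

raw = [
    {"prompt": "Translate to French: Hello", "response": "Bonjour"},
    {"prompt": "What is the capital of Italy?", "response": "Rome"},
]
prepared = prepare_dataset(raw)
```

The resulting `{"text": ...}` records can then be tokenized and handed to a trainer; the package's value is doing this plumbing for both Hub-hosted and custom datasets.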
Installation
You can install the package using pip:
pip install tydataprep