Prepare your dataset for fine-tuning LLMs
Dataset Preparation for Transformers Fine-tuning
The yDataPrep package simplifies preparing datasets for fine-tuning or training the large language models available in the Hugging Face Transformers library. Whether you pull a dataset from the Hugging Face Hub or bring your own, the package streamlines data integration for a seamless training experience.
Features
- Easily integrate your dataset with Hugging Face Transformers models for training or fine-tuning.
- Specify a model repository ID and a dataset from the Hugging Face Hub to automatically fetch and configure the data.
- Seamlessly incorporate your custom dataset by providing it as input to the package.
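The package's own API is not documented on this page, so as an illustration, the core task it automates, turning raw records into model-ready training text, can be sketched in plain Python. The field names and prompt template below are assumptions for the example, not the package's actual interface:

```python
def format_example(record,
                   template="### Instruction:\n{instruction}\n\n### Response:\n{response}"):
    """Render one raw record into a single training string.

    Assumes `record` is a dict with 'instruction' and 'response' keys;
    real datasets may use different field names and templates.
    """
    return template.format(**record)


def prepare_dataset(records):
    """Map every raw record to its formatted training string."""
    return [format_example(r) for r in records]


# Toy custom dataset standing in for either a Hub dataset or your own data.
raw = [
    {"instruction": "Translate 'hello' to French.", "response": "bonjour"},
    {"instruction": "What is 2 + 2?", "response": "4"},
]
texts = prepare_dataset(raw)
```

The resulting strings would then be tokenized and fed to a Transformers trainer; the package's value is doing this mapping for you given a model ID and a dataset name.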
Installation
Install the package with pip:

```shell
pip install yDataPrep
```
Download files
Source Distribution

yDataPrep-0.0.1.0.tar.gz (3.0 kB)

Built Distribution

Hashes for yDataPrep-0.0.1.0-py3-none-any.whl:

| Algorithm | Hash digest |
| --- | --- |
| SHA256 | 5ef12990be5c981e5405ddb42df0f015fa9475cb4c9f6651ffb3a53732809e90 |
| MD5 | c3864560dabd418d3db7c81c511eecde |
| BLAKE2b-256 | 79718f0f1aa6d5560850a3f276c5810df484f8e0a588247780b1d6424b1f3ea0 |