
Unified Parameter-Efficient Fine-Tuning of 100+ LLMs


# PEFT-Factory


Parameter-Efficient Fine-Tuning Made Easy

PEFT-Factory is a fork of LLaMA-Factory ❤️, enhanced with an easy-to-use PEFT interface, support for HuggingFace PEFT methods, and curated datasets for benchmarking PEFT approaches.

📄 System Demonstration Paper  |  🎥 Demo Video  |  🏛️ EACL 2026

🏆 PEFT-Factory was presented at EACL 2026 (19th Conference of the European Chapter of the Association for Computational Linguistics, Rabat, Morocco) as a System Demonstration.


## Supported Methods

| PEFT Method | Supported Backend |
| --- | --- |
| LoRA (including variants) | 🦙 LLaMA-Factory |
| OFT | 🦙 LLaMA-Factory |
| Prefix Tuning | 🤗 HuggingFace PEFT |
| Prompt Tuning | 🤗 HuggingFace PEFT |
| P-Tuning | 🤗 HuggingFace PEFT |
| P-Tuning v2 | 🤗 HuggingFace PEFT |
| MPT | 🤗 HuggingFace PEFT |
| IA³ | 🤗 HuggingFace PEFT |
| LNTuning | 🤗 HuggingFace PEFT |
| Bottleneck Adapter | 🤖 AdapterHub |
| Parallel Adapter | 🤖 AdapterHub |
| SeqBottleneck Adapter | 🤖 AdapterHub |
| SVFT | ⚙️ Custom |
| BitFit | ⚙️ Custom |

## Usage

This section provides instructions on how to install PEFT-Factory, download the necessary data and methods, and run training using either the command line or the web UI.

### Quickstart

For a video walkthrough, visit the PEFT-Factory Demonstration Video.

```bash
# Install the package
pip install peftfactory

# Download the repository, which contains data, PEFT methods, and examples
git clone https://github.com/kinit-sk/PEFT-Factory.git && cd PEFT-Factory

# Start the web UI
pf webui
```

Alternatively, you can run training directly from the command line:

```bash
# Install the package
pip install peftfactory

# Download the repository, which contains data, PEFT methods, and examples
git clone https://github.com/kinit-sk/PEFT-Factory.git && cd PEFT-Factory
```

#### Set Environment Variables for envsubst

Define the variables that will be substituted into the training config template:

```bash
TIMESTAMP=$(date +%s)
OUTPUT_DIR="saves/bitfit/llama-3.2-1b-instruct/train_wsc_${TIMESTAMP}"
DATASET="wsc"
SEED=123
WANDB_PROJECT="peft-factory-train-bitfit"
WANDB_NAME="bitfit_llama-3.2-1b-instruct_train_wsc"

mkdir -p "${OUTPUT_DIR}"

export OUTPUT_DIR DATASET SEED WANDB_PROJECT WANDB_NAME
```

#### Apply the Config Template

The envsubst utility replaces occurrences of environment variables in the template file with their current values:

```bash
envsubst < examples/peft/bitfit/llama-3.2-1b-instruct/train.yaml > "${OUTPUT_DIR}/train.yaml"
```

#### Run Training

```bash
peftfactory-cli train "${OUTPUT_DIR}/train.yaml"
```

### Installation

PEFT-Factory can be installed in two ways: from PyPI for the latest release, or from source for the development version.

#### From PyPI (Recommended)

```bash
pip install peftfactory
```

#### From Source

1. Clone the repository:

   ```bash
   git clone git@github.com:kinit-sk/PEFT-Factory.git
   ```

2. Build the wheel package:

   ```bash
   make build
   ```

3. Install with pip:

   ```bash
   pip install dist/[name of the built package].whl
   ```

#### Installing DeepSpeed

DeepSpeed is required for evaluation and computation of the PSCP metric.

```bash
pip install deepspeed
```

**Note:** You may encounter an error about the `CUDA_HOME` environment variable not being set. The fix depends on your environment:

##### Conda

```bash
conda install -c nvidia cuda-compiler
```

##### Standard virtualenv / pyenv

You will need to install CUDA with the nvcc compiler at the OS level. Instructions vary by operating system; consult your distribution's documentation. For example, on Arch Linux:

```bash
# Arch Linux example; the exact command differs per OS
sudo pacman -S cuda
```
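If nvcc is installed but DeepSpeed still reports the error, exporting `CUDA_HOME` explicitly usually resolves it. A sketch, assuming the toolkit lives at `/usr/local/cuda` (a common default; adjust the path for your system):

```bash
# Point CUDA_HOME at the toolkit root and expose nvcc on PATH
# (/usr/local/cuda is an assumption; adjust to where your toolkit lives)
export CUDA_HOME=/usr/local/cuda
export PATH="${CUDA_HOME}/bin:${PATH}"

# Sanity check: nvcc should now be discoverable
command -v nvcc >/dev/null && nvcc --version || echo "nvcc not found under ${CUDA_HOME}/bin"
```

Adding the two `export` lines to your shell profile makes the fix persistent across sessions.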

### Data and Methods

To download the datasets, PEFT method implementations, and example configs for training, clone the repository from GitHub:

```bash
git clone https://github.com/kinit-sk/PEFT-Factory.git && cd PEFT-Factory
```

### Running Training

#### From the Command Line

```bash
pf train [path to config file].yaml
```

#### Using the Web UI

```bash
pf webui
```

## Citation

If you use PEFT-Factory in your research, please cite our EACL 2026 System Demonstration paper:

```bibtex
@inproceedings{belanec-etal-2026-peft-factory,
    title = "{PEFT}-Factory: Unified Parameter-Efficient Fine-Tuning of Autoregressive Large Language Models",
    author = "Belanec, Robert  and
      Srba, Ivan  and
      Bielikova, Maria",
    editor = "Croce, Danilo  and
      Leidner, Jochen  and
      Moosavi, Nafise Sadat",
    booktitle = "Proceedings of the 19th Conference of the {E}uropean Chapter of the {A}ssociation for {C}omputational {L}inguistics (Volume 3: System Demonstrations)",
    month = mar,
    year = "2026",
    address = "Rabat, Morocco",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/2026.eacl-demo.15/",
    doi = "10.18653/v1/2026.eacl-demo.15",
    pages = "188--202",
    ISBN = "979-8-89176-382-1",
    abstract = "Parameter-Efficient Fine-Tuning (PEFT) methods address the increasing size of Large Language Models (LLMs). Currently, many newly introduced PEFT methods are challenging to replicate, deploy, or compare with one another. To address this, we introduce PEFT-Factory, a unified framework for efficient fine-tuning LLMs using both off-the-shelf and custom PEFT methods. While its modular design supports extensibility, it natively provides a representative set of 19 PEFT methods, 27 classification and text generation datasets addressing 12 tasks, and both standard and PEFT-specific evaluation metrics. As a result, PEFT-Factory provides a ready-to-use, controlled, and stable environment, improving replicability and benchmarking of PEFT methods. PEFT-Factory is a downstream framework that originates from the popular LLaMA-Factory, and is publicly available at https://github.com/kinit-sk/PEFT-Factory."
}
```



## Download files

### Source Distribution

`peftfactory-0.9.4.8.tar.gz` (230.1 kB)

### Built Distribution

`peftfactory-0.9.4.8-py3-none-any.whl` (317.1 kB)

## File details

Details for the file `peftfactory-0.9.4.8.tar.gz`.

### File metadata

- Download URL: peftfactory-0.9.4.8.tar.gz
- Upload date:
- Size: 230.1 kB
- Tags: Source
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.7

### File hashes

| Algorithm | Hash digest |
| --- | --- |
| SHA256 | `f48bee33332527fdc5b82d961460d04fbf105cd267f0ddaa155b53218aa4a43a` |
| MD5 | `077d9c14f4fa5c9640cf5b641a3cd508` |
| BLAKE2b-256 | `9af2cf2c8b02311da6c3fa09efb9fb2b025674e9c723436e52e7c851a02b7a6f` |
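To check a download against the published digests, a manual comparison along these lines works; `sha256sum` is from GNU coreutils, and the digest is the sdist's SHA256 from the table above:

```bash
# Compare the downloaded sdist against the published SHA256
expected="f48bee33332527fdc5b82d961460d04fbf105cd267f0ddaa155b53218aa4a43a"
actual=$(sha256sum peftfactory-0.9.4.8.tar.gz 2>/dev/null | cut -d' ' -f1)

if [ "$actual" = "$expected" ]; then
  echo "hash OK"
else
  echo "hash mismatch or file missing"
fi
```

Run this in the directory containing the downloaded file; a mismatch means the file was corrupted in transit or is not the published release.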

## Provenance

The following attestation bundles were made for `peftfactory-0.9.4.8.tar.gz`:

- Publisher: `publish.yml` on kinit-sk/PEFT-Factory
- Attestations: values shown here reflect the state when the release was signed and may no longer be current.

## File details

Details for the file `peftfactory-0.9.4.8-py3-none-any.whl`.

### File metadata

- Download URL: peftfactory-0.9.4.8-py3-none-any.whl
- Upload date:
- Size: 317.1 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.7

### File hashes

| Algorithm | Hash digest |
| --- | --- |
| SHA256 | `bb04e00af48e5064f25f5e29e75d98ab580e671457286e7bbee230a949a7e26c` |
| MD5 | `e43cadedcd0a5839772b72baa032fcd1` |
| BLAKE2b-256 | `6078c6885dd501bc89742fdf33ffa49171b9baab2f1b6d101536bf9c0d894521` |

## Provenance

The following attestation bundles were made for `peftfactory-0.9.4.8-py3-none-any.whl`:

- Publisher: `publish.yml` on kinit-sk/PEFT-Factory
- Attestations: values shown here reflect the state when the release was signed and may no longer be current.
