# PEFT-SingLoRA: Single Low-Rank Adaptation for PEFT
SingLoRA (Single Low-Rank Adaptation) is an efficient alternative to traditional LoRA that uses a single low-rank matrix instead of two, reducing trainable parameters while maintaining performance. This package provides a PEFT-compatible implementation of SingLoRA, building on kyegomez's original implementation.
## Key Features
- 🚀 50% fewer parameters than standard LoRA
- 🔧 Fully compatible with PEFT ecosystem
- 📊 Expressive power comparable to standard LoRA
- 🎯 Easy integration with existing PEFT workflows
## Installation

```bash
pip install peft-singlora
```
## Quick Start

Here is a simplified training loop showing when to update SingLoRA's global step:

```python
import torch
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model
from peft_singlora import setup_singlora, update_singlora_global_step

# Load your model
model = AutoModelForCausalLM.from_pretrained("your-model-name")

# Set up SingLoRA (this registers it with PEFT)
setup_singlora()

# Configure LoRA as usual - it will use SingLoRA under the hood
config = LoraConfig(
    r=8,
    lora_alpha=32,
    target_modules=["q_proj", "v_proj"],
    lora_dropout=0.1,
)

# Create the PEFT model - linear layers automatically use SingLoRA
peft_model = get_peft_model(model, config)
optimizer = torch.optim.AdamW(peft_model.parameters(), lr=1e-4)

# Training loop with proper step tracking
# (num_epochs and dataloader are assumed to be defined elsewhere)
gradient_accumulation_steps = 4
global_step = 0

for epoch in range(num_epochs):
    for batch_idx, batch in enumerate(dataloader):
        # Forward pass
        outputs = peft_model(**batch)
        loss = outputs.loss / gradient_accumulation_steps

        # Backward pass
        loss.backward()

        # Update weights and the global step every N batches
        if (batch_idx + 1) % gradient_accumulation_steps == 0:
            optimizer.step()
            optimizer.zero_grad()

            # Advance the SingLoRA step counter after each optimizer step
            update_singlora_global_step(peft_model, global_step)
            global_step += 1
```
## How It Works

Traditional LoRA uses two matrices (A and B) for the low-rank decomposition:

```
W = W_0 + BA
```

SingLoRA instead uses a single matrix A with a symmetric decomposition:

```
W = W_0 + α/r * A @ A^T
```

For a square d × d weight, this reduces the trainable parameters from 2 * d * r to d * r while maintaining the same expressive power.
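The decomposition above can be sketched directly in PyTorch (a minimal illustration with arbitrary shapes and scaling, not the package's actual module):

```python
import torch

d, r, alpha = 64, 8, 32.0

W_0 = torch.randn(d, d)       # frozen pretrained weight
A = torch.randn(d, r) * 0.01  # the single trainable low-rank matrix

# SingLoRA update: W = W_0 + (alpha / r) * A @ A^T
W = W_0 + (alpha / r) * (A @ A.T)

# SingLoRA trains d * r parameters; a standard LoRA pair would train 2 * d * r
print(A.numel())   # 512
print(2 * d * r)   # 1024
```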
## Advanced Usage

### Custom Configuration
```python
from peft_singlora import SingLoRAConfig

config = SingLoRAConfig(
    r=16,
    lora_alpha=32,
    target_modules=["q_proj", "v_proj", "k_proj"],
    lora_dropout=0.1,
    ramp_up_steps=1000,  # Gradually increase adaptation strength
)
```
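In the SingLoRA paper, the adaptation is scaled by a ramp function u(t) = min(t/T, 1), so the update grows linearly over the first T optimizer steps and then saturates at full strength. A minimal sketch of how such a schedule behaves (illustrative only; the package's internal scaling may differ):

```python
def ramp(step: int, ramp_up_steps: int) -> float:
    """Ramp function u(t) = min(t / T, 1) that scales the A @ A^T update."""
    return min(step / ramp_up_steps, 1.0)

# Adaptation strength grows linearly, then saturates
print(ramp(0, 1000))     # 0.0
print(ramp(500, 1000))   # 0.5
print(ramp(2000, 1000))  # 1.0
```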
### Manual Integration

```python
import torch.nn as nn
from peft_singlora import Linear as SingLoRALinear

# Register a custom module mapping on the config created earlier,
# so nn.Linear layers are replaced with SingLoRA layers
custom_module_mapping = {nn.Linear: SingLoRALinear}
config._register_custom_module(custom_module_mapping)
```
## Examples

Check out the `examples/` directory for:
- Basic usage with different model architectures
- Fine-tuning examples with real datasets
- Performance comparisons with standard LoRA
## Citation

If you use SingLoRA in your research, please cite:

```bibtex
@misc{bensaïd2025singloralowrankadaptation,
    title={SingLoRA: Low Rank Adaptation Using a Single Matrix},
    author={David Bensaïd and Noam Rotstein and Roy Velich and Daniel Bensaïd and Ron Kimmel},
    year={2025},
    eprint={2507.05566},
    archivePrefix={arXiv},
    primaryClass={cs.AI},
    url={https://arxiv.org/abs/2507.05566},
}
```
## Contributing
We welcome contributions! Please see our Contributing Guidelines for details.
## License
This project is licensed under the BSD 2-Clause License - see the LICENSE file for details.
## File details: peft_singlora-0.2.0.tar.gz

- Size: 12.3 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.10.18
File hashes:

| Algorithm | Hash digest |
|---|---|
| SHA256 | `f90eda649ff354ae9aa6777e6761e98c8eb56a7b265ba1e734203a318c2716b7` |
| MD5 | `c74726cc51c815ba8f4b1119b3891d52` |
| BLAKE2b-256 | `7cee6d916a01bf2ce1f8f306d0133c4165170efced887da386c22266a81da683` |
## File details: peft_singlora-0.2.0-py3-none-any.whl

- Size: 9.6 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.10.18
File hashes:

| Algorithm | Hash digest |
|---|---|
| SHA256 | `1ddc546eb18f9d6f4e0d80a56f224a0c852de5b72234408a58e1ca3669f9b703` |
| MD5 | `c300494d58614d692e12b3515f041a1e` |
| BLAKE2b-256 | `65705450aea2b71700ed110c36b54ed3be8b0fbe639a1663f4ad961c8d47a865` |