PyTorch implementation of low-rank adaptation (LoRA), a parameter-efficient fine-tuning method that adapts a large pre-trained model while achieving performance on par with full fine-tuning.
Low-Rank Adapters
This library implements a collection of low-rank adapters that can be applied to torch models in a simple manner, without having to redefine your model. Official Repository
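For context, the idea behind LoRA is to freeze a pre-trained weight matrix W and learn only a low-rank update BA, so the number of trainable parameters drops from in_features * out_features to rank * (in_features + out_features). A minimal sketch of the decomposition (the shapes and initialization here are illustrative, not this library's internals):

```python
import torch

out_features, in_features, rank = 256, 512, 16

W = torch.randn(out_features, in_features)  # frozen pre-trained weight
A = torch.randn(rank, in_features) * 0.01   # trainable low-rank factor
B = torch.zeros(out_features, rank)         # trainable, zero-init so the update starts at 0

# Effective weight: the base weight plus the low-rank update.
delta_W = B @ A
W_adapted = W + delta_W
```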
You can choose to store only the updated layers, as shown below:
import timm
import torch
from lora_adapters import LoraConv2d, apply_adapter, mark_only_lora_as_trainable, lora_state_dict
model = timm.create_model('resnet50', pretrained=True)
model = apply_adapter(model, LoraConv2d, rank=16)
model = mark_only_lora_as_trainable(model, bias='lora_only')
# ... custom training loop ...
updates = lora_state_dict(model, bias='lora_only')
torch.save(updates, 'updates.ckpt')
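Before training, it can be worth confirming that only the adapter (and selected bias) parameters are trainable; this quick check uses only standard torch APIs:

```python
trainable = sum(p.numel() for p in model.parameters() if p.requires_grad)
total = sum(p.numel() for p in model.parameters())
print(f'trainable parameters: {trainable:,} of {total:,} ({100 * trainable / total:.2f}%)')
```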
To restore the model later, re-apply the adapters and then load the saved updates:
import timm
import torch
from lora_adapters import LoraConv2d, apply_adapter
model = timm.create_model('resnet50', pretrained=True)
model = apply_adapter(model, LoraConv2d, rank=16)  # re-apply the adapters before loading
updates = torch.load('updates.ckpt')
model.load_state_dict(updates, strict=False)
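In PyTorch, load_state_dict returns a named tuple of missing_keys and unexpected_keys; inspecting it is a cheap way to confirm the saved updates actually matched the adapted model:

```python
result = model.load_state_dict(updates, strict=False)

# missing_keys lists the frozen backbone weights, which is expected here.
# unexpected_keys should be empty: anything listed was saved but never
# consumed, e.g. because apply_adapter was not called before loading.
assert not result.unexpected_keys, result.unexpected_keys
```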
Alternatively, you can reverse the LoRA adaptation and save the full state dictionary:
import timm
import torch
from lora_adapters import LoraConv2d, apply_adapter, mark_only_lora_as_trainable, undo_lora
model = timm.create_model('resnet50', pretrained=True)
model = apply_adapter(model, LoraConv2d, rank=16)
model = mark_only_lora_as_trainable(model, bias='lora_only')
# ... custom training loop ...
model = undo_lora(model)
torch.save(model.state_dict(), 'model.ckpt')
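Assuming undo_lora folds the learned low-rank updates back into the original layers (which is what the example above suggests; treat this as an assumption), the saved checkpoint should load into a plain, unadapted model:

```python
import timm
import torch

# Assumption: after undo_lora, the state dict has the original resnet50 keys.
plain_model = timm.create_model('resnet50', pretrained=False)
plain_model.load_state_dict(torch.load('model.ckpt'))
```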
Warnings
The functions that adapt the model modify it in place, so the code below won't work as expected: `model` and `lora_model` will be the same object.
import timm
from lora_adapters import LoraConv2d, apply_adapter
model = timm.create_model('resnet50', pretrained=True)
lora_model = apply_adapter(model, LoraConv2d, rank=16)  # model is also modified in place
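If you need to keep an untouched copy of the original model, one option is to deep-copy it before applying the adapter (plain Python, not a feature of this library):

```python
import copy

import timm
from lora_adapters import LoraConv2d, apply_adapter

model = timm.create_model('resnet50', pretrained=True)
lora_model = apply_adapter(copy.deepcopy(model), LoraConv2d, rank=16)
# model is unchanged; lora_model carries the adapters.
```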
File details
Details for the file lora_adapters-0.1.4.tar.gz.
File metadata
- Download URL: lora_adapters-0.1.4.tar.gz
- Upload date:
- Size: 9.9 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.1 CPython/3.11.3
File hashes
Algorithm | Hash digest
---|---
SHA256 | 2914628ad8b39dbb7b72eae3469d71d4368bc9818685c6acdba2f3383b9c9f65
MD5 | e4cf896047092d70fb1b1662ab226524
BLAKE2b-256 | e1d4b46977de1f0184ba5ec9e73d40c09986b34ec1b2ddbfe2229fa60f54e27e
File details
Details for the file lora_adapters-0.1.4-py3-none-any.whl.
File metadata
- Download URL: lora_adapters-0.1.4-py3-none-any.whl
- Upload date:
- Size: 7.0 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.1 CPython/3.11.3
File hashes
Algorithm | Hash digest
---|---
SHA256 | d81f2f74905b30b9adcffc73174a67b2cb9e3cacc823eefc3bfadbe814d6b42b
MD5 | e6b39daf4ac342bfed4ace61189874ea
BLAKE2b-256 | cc1afb48b76028cb7ec8a0a12de76ca686eb9f76e6b15234720c0e71717a91e8