# SAFT: Self-Attention Factor-Tuning for Parameter-Efficient Fine-Tuning
A highly efficient fine-tuning technique for large-scale neural networks.
## Table of Contents

- Quickstart
- VTAB-1k Test
- Pretrained Model
- Results

## Quickstart
Easily install SAFT using pip and get started with a simple example.
### Installation

```shell
pip install saft
```
### Example Usage

```python
from saft.saft import saft

if __name__ == "__main__":
    # get_classes_num (used below) is also provided by the saft package
    saft_instance = saft(
        model='vit_base_patch16_224',
        num_classes=get_classes_num('oxford_flowers102'),
        validation_interval=1,
        rank=3,
        scale=10,
    )

    # Replace with your PyTorch DataLoader objects
    # train_dl, test_dl = [your data in a pytorch dataloader]
    # saft_instance.upload_data(train_dl, test_dl)

    saft_instance.train(10)
    trained_model = saft_instance.model
```
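The commented-out `upload_data` call above expects ordinary PyTorch `DataLoader` objects. A minimal sketch of what those might look like, using random stand-in tensors (the tensor shapes and the two-loader interface are assumptions inferred from the example, not documented API guarantees):

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

# Hypothetical stand-in data: 8 RGB images at 224x224 (ViT-B/16 input size)
# with integer class labels. Replace with a real dataset in practice.
images = torch.randn(8, 3, 224, 224)
labels = torch.randint(0, 102, (8,))

dataset = TensorDataset(images, labels)
train_dl = DataLoader(dataset, batch_size=4, shuffle=True)
test_dl = DataLoader(dataset, batch_size=4)

# Each batch yields an (images, labels) pair
xb, yb = next(iter(train_dl))
print(xb.shape, yb.shape)  # torch.Size([4, 3, 224, 224]) torch.Size([4])
```

With real data, pass the two loaders to `saft_instance.upload_data(train_dl, test_dl)` before calling `train`.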
## VTAB-1k Test

To run tests on the VTAB-1k dataset, follow these steps:

1. Visit the SSF Data Preparation page to download the VTAB-1k dataset.
2. Place the downloaded dataset folders in `<YOUR PATH>/SAFT/data/`.
## Pretrained Model

For a quick start, download the pretrained ViT-B/16 model:

1. Download ViT-B/16.
2. Place the downloaded file at `<YOUR PATH>/SAFT/ViT-B_16.npz`.
## Results

SAFT reaches strong accuracy while training only 0.055 million (55K) backbone parameters of ViT-B/16.
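To see why factor-tuning needs so few parameters, consider the general low-rank adaptation idea: a frozen weight matrix gets a trainable rank-r additive update, so only r·(m+n) parameters are trained instead of m·n. This is an illustrative sketch of that idea, not SAFT's actual implementation; the `rank` and `scale` values mirror the hyperparameters in the Quickstart example.

```python
import numpy as np

rng = np.random.default_rng(0)

d = 768              # hidden size of ViT-B/16
rank, scale = 3, 10  # hyperparameters from the Quickstart example

W = rng.standard_normal((d, d))            # frozen pretrained weight (not trained)
A = rng.standard_normal((d, rank)) * 0.01  # trainable factor
B = np.zeros((rank, d))                    # trainable factor, zero-initialized
                                           # so W_eff == W at the start

W_eff = W + scale * (A @ B)  # effective weight used in the forward pass

trainable = A.size + B.size
print(trainable, W.size)  # → 4608 589824
```

Per 768x768 matrix, the update trains 4,608 parameters versus 589,824 for full fine-tuning, a reduction of more than 100x; summed over the tuned matrices of all 12 transformer layers, counts on this order are how the total stays near 0.055M.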