SAFT: Self-Attention Factor-Tuning for Parameter-Efficient Fine-Tuning


A highly efficient fine-tuning technique for large-scale neural networks.


Quickstart

Easily install SAFT using pip and get started with a simple example.

Installation

pip install saft

Example Usage

from saft.saft import saft

if __name__ == "__main__":
    saft_instance = saft(
        model='vit_base_patch16_224',  # timm identifier for ViT-B/16
        num_classes=102,               # oxford_flowers102 has 102 classes (the
                                       # original snippet called an unimported
                                       # get_classes_num helper here)
        validation_interval=1,         # validate every epoch
        rank=3,                        # SAFT factorization rank
        scale=10                       # scaling applied to the learned update
    )
    # Replace with your PyTorch DataLoader objects
    # train_dl, test_dl = [your data in a pytorch dataloader]
    # saft_instance.upload_data(train_dl, test_dl)
    
    saft_instance.train(10)  # train for 10 epochs
    trained_model = saft_instance.model
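
The commented-out upload_data call expects ordinary PyTorch DataLoaders. A minimal sketch of building them, assuming torchvision's Flowers102 dataset as a stand-in for your data and 224x224 inputs to match vit_base_patch16_224 (the batch size and normalization values are illustrative, not prescribed by SAFT):

from torch.utils.data import DataLoader
from torchvision import transforms
from torchvision.datasets import Flowers102

# Resize to the 224x224 resolution expected by vit_base_patch16_224.
transform = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=(0.5, 0.5, 0.5), std=(0.5, 0.5, 0.5)),
])

train_ds = Flowers102(root='./data', split='train', transform=transform, download=True)
test_ds = Flowers102(root='./data', split='test', transform=transform, download=True)

train_dl = DataLoader(train_ds, batch_size=32, shuffle=True)
test_dl = DataLoader(test_ds, batch_size=32, shuffle=False)

# saft_instance.upload_data(train_dl, test_dl)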

VTAB-1k Test

To evaluate on the VTAB-1k benchmark, follow these steps:

  1. Visit the SSF Data Preparation page to download the VTAB-1k dataset.
  2. Place the downloaded dataset folders in <YOUR PATH>/SAFT/data/ (a quick sanity check is sketched below).
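
Before launching a run, it can help to confirm the folders landed where SAFT expects them. A minimal sketch (the task names checked here are illustrative examples; VTAB-1k comprises 19 tasks in total):

import pathlib

# Substitute <YOUR PATH> with the actual location of your SAFT checkout.
data_dir = pathlib.Path('SAFT/data')

# Check a couple of VTAB-1k task folders as examples.
for task in ['oxford_flowers102', 'cifar']:
    status = 'ok' if (data_dir / task).is_dir() else 'missing'
    print(f'{task}: {status}')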

Pretrained Model

For a quick start, download the pretrained ViT-B/16 model.
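
If you only need the standard backbone weights rather than a repository-specific checkpoint, a minimal sketch using timm (an assumption; timm is the library that provides the 'vit_base_patch16_224' model name used in the Quickstart):

import timm

# ImageNet-pretrained ViT-B/16, matching the model name in the
# Quickstart example (this is not a SAFT-specific checkpoint).
backbone = timm.create_model('vit_base_patch16_224', pretrained=True)
print(sum(p.numel() for p in backbone.parameters()) / 1e6, 'M parameters')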

Results

SAFT reaches strong downstream accuracy while training only 0.055 million (55 K) backbone parameters of ViT-B/16.

[Figure: performance results]
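
The trainable-parameter figure is straightforward to check on a configured model. A minimal sketch, assuming the saft_instance from the Quickstart and that SAFT leaves only its factor parameters with requires_grad=True:

model = saft_instance.model

# Note: this count includes the classification head, while the 0.055 M
# figure above refers to backbone parameters only.
trainable = sum(p.numel() for p in model.parameters() if p.requires_grad)
total = sum(p.numel() for p in model.parameters())
print(f'trainable: {trainable / 1e6:.3f} M / total: {total / 1e6:.1f} M')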
