Project description

Note: This repository holds the codebase of the Adapters library, which has replaced adapter-transformers. For the legacy codebase, go to: https://github.com/adapter-hub/adapter-transformers-legacy.

Adapters

A Unified Library for Parameter-Efficient and Modular Transfer Learning

adapters is an add-on library to HuggingFace's Transformers that integrates adapter modules into state-of-the-art language models and connects them to AdapterHub, a central repository for pre-trained adapters.

Installation

adapters currently supports Python 3.8+ and PyTorch 1.10+. After installing PyTorch, you can install adapters from PyPI ...

pip install -U adapters

... or from source by cloning the repository:

git clone https://github.com/adapter-hub/adapters.git
cd adapters
pip install .
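
To verify the installation, a quick sanity check is to import the package and print its version (adapters exposes a standard __version__ attribute):

python -c "import adapters; print(adapters.__version__)"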

Quick Tour

Load pre-trained adapters:

from adapters import AutoAdapterModel
from transformers import AutoTokenizer

model = AutoAdapterModel.from_pretrained("roberta-base")
tokenizer = AutoTokenizer.from_pretrained("roberta-base")

model.load_adapter("AdapterHub/roberta-base-pf-imdb", source="hf", set_active=True)

print(model(**tokenizer("This works great!", return_tensors="pt")).logits)
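
The returned logits can be turned into a class prediction in the usual PyTorch way (a brief sketch; how the indices map to labels depends on the adapter's prediction head):

import torch

outputs = model(**tokenizer("This works great!", return_tensors="pt"))
# argmax over the label dimension gives the predicted class index
predicted_class = torch.argmax(outputs.logits, dim=-1).item()
print(predicted_class)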

Learn More

Adapt existing model setups:

import adapters
from transformers import AutoModelForSequenceClassification

model = AutoModelForSequenceClassification.from_pretrained("t5-base")

adapters.init(model)

model.add_adapter("my_lora_adapter", config="lora")
model.train_adapter("my_lora_adapter")

# Your regular training loop...
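
For illustration, a minimal version of that loop might look as follows (a sketch with a toy batch and hyperparameters, not part of the library; after train_adapter(), only the LoRA weights and the prediction head have requires_grad set):

import torch
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("t5-base")

# Optimize only the parameters unfrozen by train_adapter().
optimizer = torch.optim.AdamW(
    [p for p in model.parameters() if p.requires_grad], lr=1e-4
)

# Toy batch; replace with your own dataloader.
batch = tokenizer(["a great movie", "a terrible movie"], padding=True, return_tensors="pt")
batch["labels"] = torch.tensor([1, 0])

model.train()
loss = model(**batch).loss  # the model computes the loss when labels are passed
loss.backward()
optimizer.step()
optimizer.zero_grad()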

Learn More

Flexibly configure adapters:

from adapters import ConfigUnion, PrefixTuningConfig, ParBnConfig, AutoAdapterModel

model = AutoAdapterModel.from_pretrained("microsoft/deberta-v3-base")

adapter_config = ConfigUnion(
    PrefixTuningConfig(prefix_length=20),
    ParBnConfig(reduction_factor=4),
)
model.add_adapter("my_adapter", config=adapter_config, set_active=True)
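
Combining prefix tuning with a parallel bottleneck adapter in this way closely mirrors the Mix-and-Match setup of He et al. (2021). If you want that combination directly, the library also ships it as a predefined config (a sketch, assuming the MAMConfig shortcut provided for this purpose):

from adapters import MAMConfig

model.add_adapter("my_mam_adapter", config=MAMConfig(), set_active=True)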

Learn More

Easily compose adapters in a single model:

from adapters import AdapterSetup, AutoAdapterModel
import adapters.composition as ac
from transformers import AutoTokenizer

model = AutoAdapterModel.from_pretrained("roberta-base")
tokenizer = AutoTokenizer.from_pretrained("roberta-base")

qc = model.load_adapter("AdapterHub/roberta-base-pf-trec")
sent = model.load_adapter("AdapterHub/roberta-base-pf-imdb")

with AdapterSetup(ac.Parallel(qc, sent)):
    print(model(**tokenizer("What is AdapterHub?", return_tensors="pt")))
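
Within the Parallel block, both adapters process the same input in a single forward pass and each prediction head returns its own output. Other composition operators work the same way; for example, adapters can be stacked on top of each other as in the MAD-X setup (a hypothetical sketch; "lang_adapter" and "task_adapter" are placeholder names for adapters you would load or train yourself):

# "lang_adapter" and "task_adapter" are hypothetical adapter names.
with AdapterSetup(ac.Stack("lang_adapter", "task_adapter")):
    outputs = model(**tokenizer("What is AdapterHub?", return_tensors="pt"))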

Learn More

Useful Resources

HuggingFace provides excellent documentation on getting started with Transformers at https://huggingface.co/docs/transformers. adapters is fully compatible with Transformers.

To get started with adapters, refer to these locations:

  • Colab notebook tutorials, a series of notebooks providing an introduction to all the main concepts of (adapter-)transformers and AdapterHub
  • https://docs.adapterhub.ml, our documentation on training and using adapters with adapters
  • https://adapterhub.ml to explore available pre-trained adapter modules and share your own adapters
  • The Examples folder of this repository, which contains HuggingFace's example training scripts, many adapted for training adapters

Implemented Methods

Currently, adapters integrates all architectures and methods listed below:

Method | Paper(s) | Quick Links
Bottleneck adapters | Houlsby et al. (2019); Bapna and Firat (2019) | Quickstart, Notebook
AdapterFusion | Pfeiffer et al. (2021) | Docs: Training, Notebook
MAD-X, Invertible adapters | Pfeiffer et al. (2020) | Notebook
AdapterDrop | Rücklé et al. (2021) | Notebook
MAD-X 2.0, Embedding training | Pfeiffer et al. (2021) | Docs: Embeddings, Notebook
Prefix Tuning | Li and Liang (2021) | Docs
Parallel adapters, Mix-and-Match adapters | He et al. (2021) | Docs
Compacter | Mahabadi et al. (2021) | Docs
LoRA | Hu et al. (2021) | Docs
(IA)^3 | Liu et al. (2022) | Docs
UniPELT | Mao et al. (2022) | Docs
Prompt Tuning | Lester et al. (2021) | Docs
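
Each of these methods can be selected via a configuration class (or a string identifier such as "lora") passed to add_adapter(). A brief sketch, assuming the config class names of this release and a model set up as in the Quick Tour:

from adapters import (
    SeqBnConfig,         # bottleneck adapter (Pfeiffer-style)
    DoubleSeqBnConfig,   # bottleneck adapter (Houlsby-style)
    PrefixTuningConfig,  # prefix tuning
    CompacterConfig,     # Compacter
    LoRAConfig,          # LoRA
    IA3Config,           # (IA)^3
    UniPELTConfig,       # UniPELT
    PromptTuningConfig,  # prompt tuning
)

model.add_adapter("bn_adapter", config=SeqBnConfig(reduction_factor=16))
model.add_adapter("lora_adapter", config=LoRAConfig(r=8, alpha=16))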

Supported Models

We currently support the PyTorch versions of all models listed on the Model Overview page in our documentation.

Developing & Contributing

To get started with developing on Adapters yourself and learn more about ways to contribute, please see https://docs.adapterhub.ml/contributing.html.

Citation

If you use the Adapters library in your work, please consider citing our library paper: Adapters: A Unified Library for Parameter-Efficient and Modular Transfer Learning

@misc{poth2023adapters,
      title={Adapters: A Unified Library for Parameter-Efficient and Modular Transfer Learning}, 
      author={Clifton Poth and Hannah Sterz and Indraneil Paul and Sukannya Purkayastha and Leon Engländer and Timo Imhof and Ivan Vulić and Sebastian Ruder and Iryna Gurevych and Jonas Pfeiffer},
      year={2023},
      eprint={2311.11077},
      archivePrefix={arXiv},
      primaryClass={cs.CL}
}

Alternatively, for the predecessor adapter-transformers, the Hub infrastructure, and adapters uploaded by the AdapterHub team, please consider citing our initial paper: AdapterHub: A Framework for Adapting Transformers

@inproceedings{pfeiffer2020AdapterHub,
    title={AdapterHub: A Framework for Adapting Transformers},
    author={Pfeiffer, Jonas and
            R{\"u}ckl{\'e}, Andreas and
            Poth, Clifton and
            Kamath, Aishwarya and
            Vuli{\'c}, Ivan and
            Ruder, Sebastian and
            Cho, Kyunghyun and
            Gurevych, Iryna},
    booktitle={Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing: System Demonstrations},
    pages={46--54},
    year={2020}
}

Download files

Download the file for your platform.

Source Distribution

adapters-0.1.0.tar.gz (177.8 kB)


Built Distribution


adapters-0.1.0-py3-none-any.whl (229.9 kB)


File details

Details for the file adapters-0.1.0.tar.gz.

File metadata

  • Download URL: adapters-0.1.0.tar.gz
  • Upload date:
  • Size: 177.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.8.18

File hashes

Hashes for adapters-0.1.0.tar.gz:
  • SHA256: 2815e7210963429f7ea0cdb902fc9ca73db9eb42fe53a62fb2c4e319d0cff9fa
  • MD5: f3d84aa37113cec1a5a5511696e57db8
  • BLAKE2b-256: 667f6e9d4d40d7fdde24ed28293784e0edc0fdb6524d00126be94848728c1730

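To check a downloaded file against these values, you can recompute the digest locally (a generic sketch using Python's standard hashlib module; the path assumes the sdist was downloaded to the current directory):

import hashlib

with open("adapters-0.1.0.tar.gz", "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()

expected = "2815e7210963429f7ea0cdb902fc9ca73db9eb42fe53a62fb2c4e319d0cff9fa"
print("OK" if digest == expected else "MISMATCH")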

File details

Details for the file adapters-0.1.0-py3-none-any.whl.

File metadata

  • Download URL: adapters-0.1.0-py3-none-any.whl
  • Upload date:
  • Size: 229.9 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.8.18

File hashes

Hashes for adapters-0.1.0-py3-none-any.whl:
  • SHA256: c828dabaf8da60881847afcba6ebea0ae89cdd443d243cd1a866bf05366570ad
  • MD5: 5ebe626d7486ef21812ff3282ae7800e
  • BLAKE2b-256: a40195ad86c896b5917d892f984f07cfd0c4c9ed93c8029647b55394072e69ba

