
Efficient and scalable fine-tuning of Transformers with adapters

Project description

Self-Contained adapters Library

This branch disentangles adapter-transformers from HF Transformers and adds Transformers as an external dependency.

Breaking changes

  • All adapter-related classes now have to be imported via the adapters namespace, e.g.:
    from adapters import BertAdapterModel
    # ...
    
  • Built-in HF model classes can be adapted for use with adapters via a wrapper method (a minimal usage sketch follows this list), e.g.:
    import adapters
    from transformers import BertModel
    
    model = BertModel.from_pretrained("bert-base-uncased")
    adapters.init(model)
    
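As a minimal sketch of what typically follows adapters.init() (the adapter name "my_adapter" and the default adapter configuration below are illustrative, not taken from this release's docs):

    import adapters
    from transformers import BertModel

    model = BertModel.from_pretrained("bert-base-uncased")
    adapters.init(model)  # attach adapter support to the plain HF model

    # Add a new adapter under an illustrative name and activate it for the forward pass
    model.add_adapter("my_adapter")
    model.set_active_adapters("my_adapter")

    # Freeze the pre-trained weights so that only the adapter is trained
    model.train_adapter("my_adapter")
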

Model support

  • Albert
  • Bart
  • BEiT
  • Bert
  • Bert Generation
  • CLIP
  • Deberta
  • Deberta V2
  • DistilBert
  • Encoder-Decoder
  • GPT-2
  • GPT-J
  • MBart
  • Roberta
  • T5
  • ViT
  • XLM-R

TODO

Features not (yet) working:

  • Loading model + adapter checkpoints using HF classes
  • Text generation with adapters (currently only works via a hacked workaround)
  • Parallel generation with adapters
  • Using Transformers pipelines with adapters
  • Using HF language modeling classes with invertible adapters

Tasks to do for first usable version:

  • Remove the utils folder and use HF's utils instead
  • Make all tests pass
  • Update example scripts to reflect the breaking changes
  • Update docs to reflect the breaking changes
  • Update the contributing guides for the new code structure

adapters

A friendly fork of HuggingFace's Transformers, adding Adapters to PyTorch language models


adapters is an extension of HuggingFace's Transformers library, integrating adapters into state-of-the-art language models by incorporating AdapterHub, a central repository for pre-trained adapter modules.

💡 Important: This library can be used as a drop-in replacement for HuggingFace Transformers and regularly synchronizes new upstream changes. Thus, most files in this repository are direct copies from the HuggingFace Transformers source, modified only with changes required for the adapter implementations.

Installation

adapters currently supports Python 3.7+ and PyTorch 1.3.1+. After installing PyTorch, you can install adapters from PyPI ...

pip install -U adapters

... or from source by cloning the repository:

git clone https://github.com/adapter-hub/adapters.git
cd adapters
pip install .
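
To quickly confirm the installation, importing the package and printing its version should work (assuming the package exposes a __version__ attribute, as released versions do):

    python -c "import adapters; print(adapters.__version__)"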

Getting Started

HuggingFace's great documentation on getting started with Transformers can be found here. adapters is fully compatible with Transformers.

To get started with adapters, refer to these locations:

  • Colab notebook tutorials, a series of notebooks providing an introduction to all the main concepts of (adapter-)transformers and AdapterHub
  • https://docs.adapterhub.ml, our documentation on training and using adapters with adapters (a condensed training sketch follows this list)
  • https://adapterhub.ml to explore available pre-trained adapter modules and share your own adapters
  • Examples folder of this repository containing HuggingFace's example training scripts, many adapted for training adapters
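
As a hedged, condensed sketch of the training workflow described in the documentation linked above (the adapter name "sentiment", the classification head, and the save path are illustrative):

    from adapters import BertAdapterModel

    # Load a model class with support for flexible prediction heads
    model = BertAdapterModel.from_pretrained("bert-base-uncased")

    # Add a task adapter plus a matching classification head
    model.add_adapter("sentiment")
    model.add_classification_head("sentiment", num_labels=2)

    # Freeze the pre-trained weights; only the adapter (and head) are trained
    model.train_adapter("sentiment")

    # ... run your usual training loop or the HF Trainer here ...

    # Save just the small adapter module for sharing or later reuse
    model.save_adapter("./sentiment_adapter", "sentiment")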

Implemented Methods

Currently, adapters integrates all architectures and methods listed below:

  • Bottleneck adapters (Houlsby et al., 2019; Bapna and Firat, 2019): Quickstart, Notebook
  • AdapterFusion (Pfeiffer et al., 2021): Docs: Training, Notebook
  • MAD-X, Invertible adapters (Pfeiffer et al., 2020): Notebook
  • AdapterDrop (Rücklé et al., 2021): Notebook
  • MAD-X 2.0, Embedding training (Pfeiffer et al., 2021): Docs: Embeddings, Notebook
  • Prefix Tuning (Li and Liang, 2021): Docs
  • Parallel adapters, Mix-and-Match adapters (He et al., 2021): Docs
  • Compacter (Mahabadi et al., 2021): Docs
  • LoRA (Hu et al., 2021): Docs
  • (IA)^3 (Liu et al., 2022): Docs
  • UniPELT (Mao et al., 2022): Docs
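
Each method is typically selected by passing its configuration class to add_adapter. As an illustrative, hedged example using LoRA (the hyperparameter values shown are placeholders, not recommendations):

    import adapters
    from adapters import LoRAConfig
    from transformers import RobertaModel

    model = RobertaModel.from_pretrained("roberta-base")
    adapters.init(model)

    # Select the LoRA method via its config; r and alpha are illustrative values
    config = LoRAConfig(r=8, alpha=16)
    model.add_adapter("lora_adapter", config=config)
    model.train_adapter("lora_adapter")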

Supported Models

We currently support the PyTorch versions of all models listed on the Model Overview page in our documentation.

Citation

If you use this library for your work, please consider citing our paper AdapterHub: A Framework for Adapting Transformers:

@inproceedings{pfeiffer2020AdapterHub,
    title={AdapterHub: A Framework for Adapting Transformers},
    author={Pfeiffer, Jonas and
            R{\"u}ckl{\'e}, Andreas and
            Poth, Clifton and
            Kamath, Aishwarya and
            Vuli{\'c}, Ivan and
            Ruder, Sebastian and
            Cho, Kyunghyun and
            Gurevych, Iryna},
    booktitle={Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing: System Demonstrations},
    pages={46--54},
    year={2020}
}

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

adapters-0.0.0.dev20230804.tar.gz (152.6 kB)

Uploaded Source

Built Distribution

adapters-0.0.0.dev20230804-py3-none-any.whl (204.3 kB)

Uploaded Python 3

File details

Details for the file adapters-0.0.0.dev20230804.tar.gz.

File metadata

  • Download URL: adapters-0.0.0.dev20230804.tar.gz
  • Upload date:
  • Size: 152.6 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/1.13.0 pkginfo/1.4.2 requests/2.23.0 setuptools/40.5.0 requests-toolbelt/0.8.0 tqdm/4.46.0 CPython/3.6.8

File hashes

Hashes for adapters-0.0.0.dev20230804.tar.gz
  • SHA256: f2ef57c5acbebabe5612ec2a581eeb15ac32f17aeadf46c519d95586127e5527
  • MD5: f52a67a20f43825b9023e0d3cf4623cb
  • BLAKE2b-256: 93cce1fd0edb05ac94f7c283f5fa58d92ed456f7bdf4e61218549cbbbafacafe

See more details on using hashes here.
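
To verify a downloaded file against the SHA256 digest listed above, a small check along these lines should suffice:

    import hashlib

    # Compare the local file's SHA256 digest with the published value
    with open("adapters-0.0.0.dev20230804.tar.gz", "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()

    expected = "f2ef57c5acbebabe5612ec2a581eeb15ac32f17aeadf46c519d95586127e5527"
    print("OK" if digest == expected else "MISMATCH")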

File details

Details for the file adapters-0.0.0.dev20230804-py3-none-any.whl.

File metadata

  • Download URL: adapters-0.0.0.dev20230804-py3-none-any.whl
  • Upload date:
  • Size: 204.3 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/1.13.0 pkginfo/1.4.2 requests/2.23.0 setuptools/40.5.0 requests-toolbelt/0.8.0 tqdm/4.46.0 CPython/3.6.8

File hashes

Hashes for adapters-0.0.0.dev20230804-py3-none-any.whl
  • SHA256: a8df0fd84ef72487c6801ad9795e45dd36632c292fb442b0d9bd055e14e7d69c
  • MD5: 7710f5ed8c9de9b61cff64ac9fa582d2
  • BLAKE2b-256: f159c8ffde30581255806e4456b4f5009eb87c87b961720579a289a61537bc9f

See more details on using hashes here.
