
adapter-transformers

A friendly fork of HuggingFace's Transformers, adding Adapters to PyTorch language models


adapter-transformers is an extension of HuggingFace's Transformers library, integrating adapters into state-of-the-art language models by incorporating AdapterHub, a central repository for pre-trained adapter modules.

💡 Important: This library can be used as a drop-in replacement for HuggingFace Transformers and regularly synchronizes new upstream changes. Thus, most files in this repository are direct copies from the HuggingFace Transformers source, modified only with changes required for the adapter implementations.
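Because the library is a drop-in replacement, existing Transformers code runs unchanged, and the adapter methods are simply available on the same model classes. As a minimal sketch (assuming the public bert-base-uncased checkpoint; the adapter name "demo" is just an illustrative label):

from transformers import AutoModel, AutoTokenizer

# Standard HuggingFace Transformers API, unchanged
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

# Additional methods provided by adapter-transformers
model.add_adapter("demo")          # add a new bottleneck adapter
model.set_active_adapters("demo")  # activate it for the forward pass

outputs = model(**tokenizer("Adapters are lightweight.", return_tensors="pt"))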

Installation

adapter-transformers currently supports Python 3.7+ and PyTorch 1.3.1+. After installing PyTorch, you can install adapter-transformers from PyPI ...

pip install -U adapter-transformers

... or from source by cloning the repository:

git clone https://github.com/adapter-hub/adapter-transformers.git
cd adapter-transformers
pip install .

Getting Started

For getting started with Transformers in general, refer to HuggingFace's excellent documentation; adapter-transformers is fully compatible with Transformers.

To get started with adapters, refer to these locations:

  • Colab notebook tutorials, a series of notebooks providing an introduction to all the main concepts of (adapter-)transformers and AdapterHub
  • https://docs.adapterhub.ml, our documentation on training and using adapters with adapter-transformers
  • https://adapterhub.ml to explore available pre-trained adapter modules and share your own adapters
  • The examples folder of this repository, which contains HuggingFace's example training scripts, many of them adapted for training adapters
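For a first impression of the workflow covered in these resources, here is a minimal training sketch (the adapter/head name "sentiment" and the label count are illustrative, not prescribed by the library):

from transformers import AutoAdapterModel

# Load a pre-trained model with support for adapters and prediction heads
model = AutoAdapterModel.from_pretrained("bert-base-uncased")

# Add a task adapter and a matching classification head
model.add_adapter("sentiment")
model.add_classification_head("sentiment", num_labels=2)

# Freeze all base model weights and train only the adapter (plus head)
model.train_adapter("sentiment")

The resulting model can then be trained with the usual HuggingFace Trainer; only the adapter and head parameters receive gradient updates.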

Implemented Methods

Currently, adapter-transformers integrates all architectures and methods listed below:

| Method | Paper(s) | Quick Links |
| --- | --- | --- |
| Bottleneck adapters | Houlsby et al. (2019), Bapna and Firat (2019) | Quickstart, Notebook |
| AdapterFusion | Pfeiffer et al. (2021) | Docs: Training, Notebook |
| MAD-X, Invertible adapters | Pfeiffer et al. (2020) | Notebook |
| AdapterDrop | Rücklé et al. (2021) | Notebook |
| MAD-X 2.0, Embedding training | Pfeiffer et al. (2021) | Docs: Embeddings, Notebook |
| Prefix Tuning | Li and Liang (2021) | Docs |
| Parallel adapters, Mix-and-Match adapters | He et al. (2021) | Docs |
| Compacter | Mahabadi et al. (2021) | Docs |
| LoRA | Hu et al. (2021) | Docs |
| (IA)^3 | Liu et al. (2022) | Docs |
| UniPELT | Mao et al. (2022) | Docs |
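Each of these methods is selected by passing a configuration object to add_adapter. As a hedged sketch (the hyperparameter values below are illustrative, not recommendations):

from transformers import AutoAdapterModel
from transformers.adapters import LoRAConfig, PrefixTuningConfig

model = AutoAdapterModel.from_pretrained("roberta-base")

# LoRA (Hu et al., 2021): low-rank updates to weight matrices
model.add_adapter("lora_example", config=LoRAConfig(r=8, alpha=16))

# Prefix Tuning (Li and Liang, 2021): trainable prefix vectors in each layer
model.add_adapter("prefix_example", config=PrefixTuningConfig(prefix_length=30))

# Activate and train one of them; the base model stays frozen
model.train_adapter("lora_example")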

Supported Models

We currently support the PyTorch versions of all models listed on the Model Overview page in our documentation.

Citation

If you use this library for your work, please consider citing our paper AdapterHub: A Framework for Adapting Transformers:

@inproceedings{pfeiffer2020AdapterHub,
    title={AdapterHub: A Framework for Adapting Transformers},
    author={Pfeiffer, Jonas and
            R{\"u}ckl{\'e}, Andreas and
            Poth, Clifton and
            Kamath, Aishwarya and
            Vuli{\'c}, Ivan and
            Ruder, Sebastian and
            Cho, Kyunghyun and
            Gurevych, Iryna},
    booktitle={Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing: System Demonstrations},
    pages={46--54},
    year={2020}
}



Download files

Download the file for your platform.

Source Distribution

adapter-transformers-3.1.0.tar.gz (4.0 MB)


Built Distribution

adapter_transformers-3.1.0-py3-none-any.whl (4.8 MB)


File details

Details for the file adapter-transformers-3.1.0.tar.gz.

File metadata

  • Download URL: adapter-transformers-3.1.0.tar.gz
  • Size: 4.0 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.1 CPython/3.10.4

File hashes

Hashes for adapter-transformers-3.1.0.tar.gz
| Algorithm | Hash digest |
| --- | --- |
| SHA256 | f80756f1dc4a23bae73fd68b5edd86bd780c3435d693dd0889b585dc865d16dd |
| MD5 | 521a7c511e0ce0fcff53ea65f914b89c |
| BLAKE2b-256 | 6e2d3cdf1473c2b47caef4b80e761b05f039ccd8a1ceea4375b92188e3e5432d |

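To check a downloaded file against the digests above, a minimal Python sketch (assuming the sdist was saved to the current directory):

import hashlib

EXPECTED_SHA256 = "f80756f1dc4a23bae73fd68b5edd86bd780c3435d693dd0889b585dc865d16dd"

# Hash the downloaded archive and compare against the published digest
with open("adapter-transformers-3.1.0.tar.gz", "rb") as f:
    actual = hashlib.sha256(f.read()).hexdigest()

assert actual == EXPECTED_SHA256, "SHA256 mismatch: do not install this file"
print("SHA256 verified")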

File details

Details for the file adapter_transformers-3.1.0-py3-none-any.whl.

File metadata

File hashes

Hashes for adapter_transformers-3.1.0-py3-none-any.whl
| Algorithm | Hash digest |
| --- | --- |
| SHA256 | d89c2c10ecd870bfce77ac483fef6b5f87d8683a9f91278585ca6023f67f49a9 |
| MD5 | 52b936ac2f333cf52a6bc1902ab0db36 |
| BLAKE2b-256 | 2607fafa38908581ec2c916762cfbdd2d321fd0e34885639318716d349559d6e |

