adapter-transformers
A friendly fork of HuggingFace's Transformers, adding Adapters to PyTorch language models
adapter-transformers is an extension of HuggingFace's Transformers library, integrating adapters into state-of-the-art language models by incorporating AdapterHub, a central repository for pre-trained adapter modules.
💡 Important: This library can be used as a drop-in replacement for HuggingFace Transformers and regularly synchronizes new upstream changes. Thus, most files in this repository are direct copies from the HuggingFace Transformers source, modified only with changes required for the adapter implementations.
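Because it is a drop-in replacement, code written against plain Transformers keeps working once adapter-transformers is installed. A minimal illustration (the checkpoint name is only an example):

```python
# Standard Transformers usage runs unchanged under adapter-transformers.
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

inputs = tokenizer("Adapters add small trainable modules to a frozen model.", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # torch.Size([1, seq_len, 768]) for bert-base
```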
Installation
adapter-transformers currently supports Python 3.6+ and PyTorch 1.3.1+.
After installing PyTorch, you can install adapter-transformers from PyPI ...
```bash
pip install -U adapter-transformers
```
... or from source by cloning the repository:
```bash
git clone https://github.com/adapter-hub/adapter-transformers.git
cd adapter-transformers
pip install .
```
Getting Started
HuggingFace's great documentation on getting started with Transformers can be found here. adapter-transformers is fully compatible with Transformers.
To get started with adapters, refer to these locations:
- Colab notebook tutorials, a series of notebooks providing an introduction to all the main concepts of (adapter-)transformers and AdapterHub
- https://docs.adapterhub.ml, our documentation on training and using adapters with adapter-transformers
- https://adapterhub.ml to explore available pre-trained adapter modules and share your own adapters
- Examples folder of this repository containing HuggingFace's example training scripts, many adapted for training adapters
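As a first orientation, here is a minimal sketch of the adapter workflow: load a pre-trained adapter from AdapterHub and activate it for inference. The model and adapter identifiers are illustrative examples; consult the documentation above for the exact API of your installed version.

```python
from transformers import AutoModelWithHeads, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelWithHeads.from_pretrained("bert-base-uncased")

# Download a pre-trained adapter (with its prediction head) from AdapterHub
# and make it active for subsequent forward passes.
adapter_name = model.load_adapter("sentiment/sst-2@ukp")
model.set_active_adapters(adapter_name)

inputs = tokenizer("Adapters make fine-tuning lightweight!", return_tensors="pt")
outputs = model(**inputs)
```

Training a new adapter follows the same pattern: model.add_adapter("my_task") adds fresh adapter modules, and model.train_adapter("my_task") freezes the pre-trained weights so that only the adapter parameters are updated during training.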
Citation
If you use this library for your work, please consider citing our paper AdapterHub: A Framework for Adapting Transformers:
@inproceedings{pfeiffer2020AdapterHub,
title={AdapterHub: A Framework for Adapting Transformers},
author={Pfeiffer, Jonas and
R{\"u}ckl{\'e}, Andreas and
Poth, Clifton and
Kamath, Aishwarya and
Vuli{\'c}, Ivan and
Ruder, Sebastian and
Cho, Kyunghyun and
Gurevych, Iryna},
booktitle={Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing: System Demonstrations},
pages={46--54},
year={2020}
}
Download files
Source distribution: adapter-transformers-2.1.0.tar.gz

Algorithm | Hash digest
---|---
SHA256 | 80618069d990e4348167f29ce35c65bca0b3ec945a3e34b91f2facd4ef01fb3b
MD5 | a9105648156fa1bde2c2949a12dcd210
BLAKE2b-256 | 9ca317e317433d92135d14a252975cbe88e46110b391bba1b60c9580c5fed73b

Built distribution: adapter_transformers-2.1.0-py3-none-any.whl

Algorithm | Hash digest
---|---
SHA256 | 4c4391aaafbfe261539fbbc4f4f786b0375a1f9fada54e1570c545d66128b47e
MD5 | 18d36debd61e984a323810bd9384f19c
BLAKE2b-256 | dca3a29b98cecaf75c0e0891a968057af7b00ade69076ff2bfc4b7a0190e97c7