adapter-transformers
A friendly fork of HuggingFace's Transformers, adding Adapters to PyTorch language models
adapter-transformers is an extension of HuggingFace's Transformers library that integrates adapters into state-of-the-art language models and connects to AdapterHub, a central repository for pre-trained adapter modules.
This library can be used as a drop-in replacement for HuggingFace Transformers and is regularly synchronized with new upstream changes.
Quick tour
adapter-transformers currently supports Python 3.6+ and PyTorch 1.1.0+. After installing PyTorch, you can install adapter-transformers from PyPI ...
pip install -U adapter-transformers
... or from source by cloning the repository:
git clone https://github.com/adapter-hub/adapter-transformers.git
cd adapter-transformers
pip install .
Getting Started
For getting started with Transformers itself, refer to HuggingFace's excellent documentation. adapter-transformers is fully compatible with Transformers.
To get started with adapters, refer to these locations:
- Colab notebook tutorials, a series of notebooks providing an introduction to all the main concepts of (adapter-)transformers and AdapterHub
- https://docs.adapterhub.ml, our documentation on training and using adapters with adapter-transformers
- https://adapterhub.ml to explore available pre-trained adapter modules and share your own adapters
- The examples folder of this repository, containing HuggingFace's example training scripts, many of them adapted for training adapters
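As a quick illustration of the Hub workflow referenced above, a pre-trained adapter can be downloaded from AdapterHub and attached to a compatible model. The hub specifier below (`"sentiment/sst-2@ukp"`) follows the format shown in the AdapterHub documentation of this era and is only illustrative; adapter availability and call signatures may differ across versions:

```python
from transformers import AutoModel

# Load the base model as usual.
model = AutoModel.from_pretrained("bert-base-uncased")

# Download a pre-trained task adapter from AdapterHub and attach it.
# In the 1.x API, the loaded adapter was then selected per forward pass
# via the `adapter_names` argument (assumption based on the v1 docs).
model.load_adapter("sentiment/sst-2@ukp")
```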
Citation
If you find this library useful, please cite our paper AdapterHub: A Framework for Adapting Transformers:
@inproceedings{pfeiffer2020AdapterHub,
title={AdapterHub: A Framework for Adapting Transformers},
author={Pfeiffer, Jonas and
R{\"u}ckl{\'e}, Andreas and
Poth, Clifton and
Kamath, Aishwarya and
Vuli{\'c}, Ivan and
Ruder, Sebastian and
Cho, Kyunghyun and
Gurevych, Iryna},
booktitle={Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing: System Demonstrations},
pages={46--54},
year={2020}
}
Download files
Download the file for your platform.
Source Distribution
Built Distribution
Hashes for adapter-transformers-1.1.1.tar.gz

Algorithm | Hash digest
---|---
SHA256 | 1e61cb0ef09454ae6e86d6cfc1dfcd031e9d9dd9128667ffe32e2e2a7049102b
MD5 | 6ed54b864a510f34010c447ef362a697
BLAKE2b-256 | fbb804d9b117eb1bb21eec64a6fbde33f891dcac78623249a764aecc1074e9a4
Hashes for adapter_transformers-1.1.1-py3-none-any.whl

Algorithm | Hash digest
---|---
SHA256 | 0b9947840ed25e8edc709459c9b8ec003f4b1e959ab39d6d203a961236f805f5
MD5 | 6fe6142c0ae6b1ee649bb704144333d6
BLAKE2b-256 | 9e8a5a4cd4ed09201f76d5eb6d7a36231bc98da2bfa28e2d03c7abfafcdf6baf