adapter-transformers
A friendly fork of Huggingface's Transformers, adding Adapters to PyTorch language models
adapter-transformers
is an extension of Huggingface's Transformers library that integrates adapters into state-of-the-art language models and incorporates AdapterHub, a central repository for pre-trained adapter modules.
The library can be used as a drop-in replacement for Huggingface Transformers and regularly synchronizes new upstream changes.
Installation
adapter-transformers currently supports Python 3.6+ and PyTorch 1.1.0+. After installing PyTorch, you can install adapter-transformers from PyPI ...
pip install -U adapter-transformers
... or from source by cloning the repository:
git clone https://github.com/adapter-hub/adapter-transformers.git
cd adapter-transformers
pip install .
Getting Started
HuggingFace provides excellent documentation on getting started with Transformers, and adapter-transformers is fully compatible with Transformers.
To get started with adapters, refer to these locations:
- https://docs.adapterhub.ml, our documentation on training and using adapters with adapter-transformers
- https://adapterhub.ml to explore available pre-trained adapter modules and share your own adapters
- Examples folder of this repository containing HuggingFace's example training scripts, many adapted for training adapters
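To illustrate the drop-in workflow, here is a minimal sketch of loading a pre-trained model together with a pre-trained adapter from AdapterHub. The model name, adapter identifier, and the exact `load_adapter` call are assumptions based on the AdapterHub documentation linked above and may vary between library versions:

```python
def load_model_with_adapter(model_name: str, adapter_id: str):
    """Load a pre-trained model and attach a pre-trained adapter from AdapterHub.

    Sketch only: the exact `load_adapter` signature and valid adapter
    identifiers are documented at https://docs.adapterhub.ml and may
    differ between versions of adapter-transformers.
    """
    # Imported lazily so this sketch can be read without the library installed;
    # with adapter-transformers installed, `transformers` resolves to this fork.
    from transformers import AutoModel, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_name)
    model = AutoModel.from_pretrained(model_name)
    # Downloads the adapter weights from AdapterHub and injects the adapter
    # modules into the model's transformer layers.
    model.load_adapter(adapter_id)
    return tokenizer, model


if __name__ == "__main__":
    # Hypothetical example: BERT with a sentiment adapter trained on SST-2
    # (browse real adapter identifiers at https://adapterhub.ml).
    tokenizer, model = load_model_with_adapter("bert-base-uncased", "sst/sst-2@ukp")
```

Because adapters add only a small number of parameters per task while the base model stays frozen, many task adapters can share one downloaded model.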
Citation
If you find this library useful, please cite our paper AdapterHub: A Framework for Adapting Transformers:
@article{pfeiffer2020AdapterHub,
title={AdapterHub: A Framework for Adapting Transformers},
author={Jonas Pfeiffer and
Andreas R\"uckl\'{e} and
Clifton Poth and
Aishwarya Kamath and
Ivan Vuli\'{c} and
Sebastian Ruder and
Kyunghyun Cho and
Iryna Gurevych},
journal={arXiv preprint},
year={2020},
url={https://arxiv.org/abs/2007.07779}
}
Hashes for adapter-transformers-1.0.0.tar.gz (source distribution)

| Algorithm | Hash digest |
|---|---|
| SHA256 | bf45696614e0811c1212cc8b6f80b933154e71382c4e4359196b604a74250cae |
| MD5 | 3836f081127916b9c518c5fc7ce404f6 |
| BLAKE2b-256 | 0d15bea12355c61bbd3d9f22ff97a257815e4be33028779d1e4f0cbdb3c84637 |
Hashes for adapter_transformers-1.0.0-py3-none-any.whl (built distribution)

| Algorithm | Hash digest |
|---|---|
| SHA256 | b13ddecafe5f37b1e5c52c2fc69533b4b8a74ddb978c9c0a47ca1b8e183df40f |
| MD5 | 84aa5362b364efea1041ea38767ce7e4 |
| BLAKE2b-256 | 537ba91ec87b2d750795f78052c9c90388d2eb2f9f4e76bfb4fbcf1c832d1efc |