
Facebook AI Research Sequence-to-Sequence Toolkit

Project description



Fairseq(-py) is a sequence modeling toolkit that allows researchers and developers to train custom models for translation, summarization, language modeling and other text generation tasks.

We provide reference implementations of various sequence modeling papers; the full list of implemented papers is available in the GitHub repository.

We also provide pre-trained models for translation and language modeling with a convenient torch.hub interface:

import torch

# Load a pre-trained English-German transformer (WMT'19) via torch.hub
en2de = torch.hub.load('pytorch/fairseq', 'transformer.wmt19.en-de.single_model')
en2de.translate('Hello world', beam=5)
# 'Hallo Welt'

See the PyTorch Hub tutorials for translation and RoBERTa for more examples.
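
RoBERTa models are served through the same torch.hub interface. A minimal sketch, following the published RoBERTa tutorial (the example sentence and topk value are illustrative):

import torch

# Load pre-trained RoBERTa (large) through torch.hub
roberta = torch.hub.load('pytorch/fairseq', 'roberta.large')
roberta.eval()  # disable dropout for deterministic output

# Fill in a masked token; returns the top-k candidate completions with scores
roberta.fill_mask('The capital of France is <mask>.', topk=3)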

Requirements and Installation

  • PyTorch version >= 1.5.0
  • Python version >= 3.6
  • For training new models, you'll also need an NVIDIA GPU and NCCL
  • To install fairseq and develop locally:
git clone https://github.com/pytorch/fairseq
cd fairseq
pip install --editable ./

# on macOS:
# CFLAGS="-stdlib=libc++" pip install --editable ./

# to install the latest stable release (0.12.x)
# pip install fairseq
  • For faster training, install NVIDIA's apex library:
git clone https://github.com/NVIDIA/apex
cd apex
pip install -v --no-cache-dir --global-option="--cpp_ext" --global-option="--cuda_ext" \
  --global-option="--deprecated_fused_adam" --global-option="--xentropy" \
  --global-option="--fast_multihead_attn" ./
  • For large datasets, install PyArrow: pip install pyarrow
  • If you use Docker, make sure to increase the shared memory size, either with --ipc=host or --shm-size as command-line options to nvidia-docker run.
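
After completing the steps above, a quick sanity check confirms that fairseq imports and that PyTorch can see a GPU (a minimal sketch; fairseq exposes __version__ from its top-level package):

import fairseq
import torch

# Confirm the install and report whether GPU training is possible
print('fairseq version:', fairseq.__version__)
print('CUDA available:', torch.cuda.is_available())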

Getting Started

The full documentation contains instructions for getting started, training new models and extending fairseq with new model types and tasks.
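
To give a flavor of those extension points, here is a minimal sketch of registering a new model type, following the pattern in the documentation's tutorials; the name 'toy_classifier' and the class body are illustrative placeholders, not part of fairseq:

import torch.nn as nn
from fairseq.models import BaseFairseqModel, register_model

@register_model('toy_classifier')  # hypothetical model name, for illustration only
class ToyClassifier(BaseFairseqModel):
    def __init__(self, embed_dim=128, num_classes=2):
        super().__init__()
        self.proj = nn.Linear(embed_dim, num_classes)

    @classmethod
    def build_model(cls, args, task):
        # fairseq constructs registered models through build_model(args, task)
        return cls()

    def forward(self, features):
        return self.proj(features)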

Pre-trained models and examples

We provide pre-trained models and pre-processed, binarized test sets for several tasks, as well as example training and evaluation commands.
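
Once a pre-trained checkpoint has been downloaded and extracted, it can also be loaded without torch.hub through each model's from_pretrained() helper (a sketch; the directory and file names below are illustrative placeholders):

from fairseq.models.transformer import TransformerModel

# Load an extracted WMT'19 English-German checkpoint from local disk
en2de = TransformerModel.from_pretrained(
    'wmt19.en-de.joined-dict.single_model',  # placeholder: extracted model directory
    checkpoint_file='model.pt',
    tokenizer='moses',
    bpe='fastbpe',
)
print(en2de.translate('Hello world', beam=5))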

We also have more detailed READMEs to reproduce results from specific papers; see the examples directory in the repository.

License

fairseq(-py) is MIT-licensed. The license applies to the pre-trained models as well.

Citation

Please cite as:

@inproceedings{ott2019fairseq,
  title = {fairseq: A Fast, Extensible Toolkit for Sequence Modeling},
  author = {Myle Ott and Sergey Edunov and Alexei Baevski and Angela Fan and Sam Gross and Nathan Ng and David Grangier and Michael Auli},
  booktitle = {Proceedings of NAACL-HLT 2019: Demonstrations},
  year = {2019},
}

Download files

Download the file for your platform.

Source Distribution

fairseq-0.12.0.tar.gz (9.6 MB, source)

Built Distributions

fairseq-0.12.0-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.whl (11.0 MB, CPython 3.8, manylinux: glibc 2.5+, x86-64)
fairseq-0.12.0-cp38-cp38-macosx_10_9_x86_64.whl (10.4 MB, CPython 3.8, macOS 10.9+, x86-64)
fairseq-0.12.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.whl (10.9 MB, CPython 3.7m, manylinux: glibc 2.5+, x86-64)
fairseq-0.12.0-cp37-cp37m-macosx_10_9_x86_64.whl (10.4 MB, CPython 3.7m, macOS 10.9+, x86-64)
fairseq-0.12.0-cp36-cp36m-manylinux_2_5_x86_64.manylinux1_x86_64.whl (10.9 MB, CPython 3.6m, manylinux: glibc 2.5+, x86-64)
fairseq-0.12.0-cp36-cp36m-macosx_10_9_x86_64.whl (10.4 MB, CPython 3.6m, macOS 10.9+, x86-64)
