
A Python implementation of OpenNMT

Project description

OpenNMT-py: Open-Source Neural Machine Translation and (Large) Language Models


OpenNMT-py is the PyTorch version of the OpenNMT project, an open-source (MIT) neural machine translation (and beyond!) framework. It is designed to be research-friendly for trying out new ideas in translation, language modeling, summarization, and many other NLP tasks. Several companies have proven the code to be production-ready.

We love contributions! Please look at issues marked with the contributions welcome tag.

Before raising an issue, make sure you read the requirements and the Full Documentation examples.

Unless there is a bug, please use the Forum or Gitter to ask questions.


For beginners:

There is a step-by-step, fully explained tutorial (thanks to Yasmin Moslem): Tutorial

Please read and/or follow it before raising beginner issues.

Otherwise, you can just have a look at the Quickstart steps.


New:

  • Special note on PyTorch v2: up to v2.0.1, dynamic shapes are not handled properly, so torch.compile() will not work with OpenNMT-py. We have tested the nightly builds (in May) and compilation works with a small gain. The next supported version will be 2.1.
  • LLM support with converters for: Llama, OpenLlama, Redpajama, MPT-7B, Falcon.
  • Support for 8-bit and 4-bit quantization along with LoRA adapters, with or without checkpointing.
  • You can finetune 7B and 13B models on a single 24 GB RTX GPU with 4-bit quantization.
  • Inference can be forced to 4-bit/8-bit using the same layer quantization as in finetuning.
  • Once your model is finetuned, you can run inference either with OpenNMT-py or, faster, with CTranslate2 (a minimal example follows this list).
  • MMLU evaluation script, see results here
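
For the CTranslate2 route, here is a minimal inference sketch. It assumes the finetuned checkpoint has already been converted to CTranslate2 format (for example with the ct2-opennmt-py-converter tool shipped with CTranslate2) and that a SentencePiece model handles tokenization; model and file paths are placeholders.

import ctranslate2
import sentencepiece as spm

# Load the converted model and the tokenizer (paths are placeholders).
translator = ctranslate2.Translator("model_ct2/", device="cpu")
sp = spm.SentencePieceProcessor(model_file="sentencepiece.model")

# Tokenize, translate, then detokenize the best hypothesis.
tokens = sp.encode("Hello world!", out_type=str)
results = translator.translate_batch([tokens])
print(sp.decode(results[0].hypotheses[0]))

For decoder-only LLMs, CTranslate2 exposes a Generator class with a similar generate_batch API.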

For all use cases, including NMT, you can now use multi-query instead of multi-head attention (faster at both training and inference) and remove biases from all linear layers (QKV as well as feed-forward modules).
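
As a rough illustration, these switches live in the training YAML configuration; the snippet below just writes such a fragment from Python. The option names (multiquery, add_qkvbias, add_ffnbias) are assumptions based on recent OpenNMT-py 3.x releases and should be checked against the Full Documentation.

# Illustrative only: the option names below are assumptions taken from recent
# OpenNMT-py 3.x releases; verify them in the Full Documentation before use.
import yaml

config_fragment = {
    "multiquery": True,    # multi-query instead of multi-head attention
    "add_qkvbias": False,  # no bias on the query/key/value projections
    "add_ffnbias": False,  # no bias in the feed-forward modules
}
print(yaml.safe_dump(config_fragment))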

If you used previous versions of OpenNMT-py, you can check the Changelog or the Breaking Changes.


Tutorials:

  • How to replicate Vicuna with a 7B or 13B Llama (or OpenLlama, MPT-7B, Redpajama) language model: Tuto Vicuna
  • How to finetune NLLB-200 with your dataset: Tuto Finetune NLLB-200
  • How to create a simple OpenNMT-py REST Server: Tuto REST
  • How to create a simple Web Interface: Tuto Streamlit
  • Replicate the WMT17 en-de experiment: WMT17 ENDE

Setup

OpenNMT-py requires:

  • Python >= 3.8
  • PyTorch >= 1.13, < 2.1

Install OpenNMT-py from pip:

pip install OpenNMT-py

or from source:

git clone https://github.com/OpenNMT/OpenNMT-py.git
cd OpenNMT-py
pip install -e .

Note: if you encounter a MemoryError during installation, try to use pip with --no-cache-dir.

(Optional) Some advanced features (e.g., working with pretrained models or specific transforms) require extra packages; you can install them with:

pip install -r requirements.opt.txt
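
Once installed, a quick sanity check is to import the package from Python; onmt is the import name of OpenNMT-py (the version attribute below is assumed to be present in the installed release).

# Minimal check that OpenNMT-py is importable.
import onmt
print(onmt.__version__)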

Documentation & FAQs

Full HTML Documentation

FAQs

Acknowledgements

OpenNMT-py is run as a collaborative open-source project. The project was incubated by Systran and Harvard NLP in 2016 in Lua and ported to PyTorch in 2017.

Current maintainers (since 2018):

François Hernandez and the Ubiqus team, Vincent Nguyen (Seedfall).

Citation

If you are using OpenNMT-py for academic work, please cite the initial system demonstration paper published in ACL 2017:

@inproceedings{klein-etal-2017-opennmt,
    title = "{O}pen{NMT}: Open-Source Toolkit for Neural Machine Translation",
    author = "Klein, Guillaume  and
      Kim, Yoon  and
      Deng, Yuntian  and
      Senellart, Jean  and
      Rush, Alexander",
    booktitle = "Proceedings of {ACL} 2017, System Demonstrations",
    month = jul,
    year = "2017",
    address = "Vancouver, Canada",
    publisher = "Association for Computational Linguistics",
    url = "https://www.aclweb.org/anthology/P17-4012",
    pages = "67--72",
}

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

OpenNMT-py-3.4.tar.gz (207.6 kB view details)

Uploaded Source

Built Distribution

OpenNMT_py-3.4-py3-none-any.whl (250.5 kB view details)

Uploaded Python 3

File details

Details for the file OpenNMT-py-3.4.tar.gz.

File metadata

  • Download URL: OpenNMT-py-3.4.tar.gz
  • Upload date:
  • Size: 207.6 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.8.18

File hashes

Hashes for OpenNMT-py-3.4.tar.gz

  • SHA256: a2b7926235f788513e1027fb9d09427adc51604cf78c11be5f45ce6007070e3a
  • MD5: 2ba92297f283439bbcd134df2ef489a2
  • BLAKE2b-256: 9c064b8d50298e0d1f1a8cc4ddb1a1636a8051c104e401228dd081093b8a3f09

See more details on using hashes here.
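
If you want to verify a downloaded archive yourself, a small sketch using Python's standard hashlib module follows; the file name is the source distribution listed above.

# Compare the SHA256 digest of the downloaded archive with the published value.
import hashlib

expected = "a2b7926235f788513e1027fb9d09427adc51604cf78c11be5f45ce6007070e3a"
with open("OpenNMT-py-3.4.tar.gz", "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()
print("OK" if digest == expected else "MISMATCH")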

File details

Details for the file OpenNMT_py-3.4-py3-none-any.whl.

File metadata

  • Download URL: OpenNMT_py-3.4-py3-none-any.whl
  • Upload date:
  • Size: 250.5 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.8.18

File hashes

Hashes for OpenNMT_py-3.4-py3-none-any.whl

  • SHA256: 6aec8b44d896486c9e2a4e9d3a536f4dec8949b2a28d105442902e1775e293da
  • MD5: a335be7234f5e34d3b5cf4a04487abd1
  • BLAKE2b-256: 14af2b36fb4061c38ba51b549a9bde005132272600295f316bb04ae7e00bd609

See more details on using hashes here.
