Sequence-to-Sequence framework for Neural Machine Translation
Project description
Sockeye
This package contains the Sockeye project, a sequence-to-sequence framework for Neural Machine Translation based on Apache MXNet (Incubating). It implements state-of-the-art encoder-decoder architectures, such as:
- Deep Recurrent Neural Networks with attention [Bahdanau et al., '14]
- Transformer models with self-attention [Vaswani et al., '17]
- Fully convolutional sequence-to-sequence models [Gehring et al., '17]
In addition, it provides an experimental image-to-description module that can be used for image captioning. Recent developments and changes are tracked in our CHANGELOG.
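As a quick illustration of a typical workflow, the sketch below drives the `sockeye.train` and `sockeye.translate` command-line entry points of the 1.x series from Python. This is a minimal sketch, not an excerpt from the Sockeye documentation: the file names (`train.de`, `train.en`, `dev.de`, `dev.en`, `test.de`) and the output directory `wmt_model` are placeholder assumptions, so substitute your own tokenized parallel data.

```python
# Minimal sketch of a Sockeye 1.x workflow, driven via subprocess.
# File names and the model directory are placeholders (assumptions),
# not values prescribed by the Sockeye documentation.
import subprocess
import sys

# Train an encoder-decoder model on parallel training data,
# validating on a held-out development set.
subprocess.run(
    [sys.executable, "-m", "sockeye.train",
     "--source", "train.de", "--target", "train.en",
     "--validation-source", "dev.de", "--validation-target", "dev.en",
     "--output", "wmt_model"],
    check=True,
)

# Translate: sockeye.translate reads source sentences from stdin
# and writes translations to stdout.
with open("test.de", "rb") as source_sentences:
    subprocess.run(
        [sys.executable, "-m", "sockeye.translate", "--models", "wmt_model"],
        stdin=source_sentences,
        check=True,
    )
```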
If you have any questions or discover problems, please file an issue. You can also send questions to sockeye-dev-at-amazon-dot-com.
Documentation
For information on how to use Sockeye, please visit our documentation. Developers may be interested in our developer guidelines.
Citation
For technical information about Sockeye, see our paper on the arXiv (BibTeX):
Felix Hieber, Tobias Domhan, Michael Denkowski, David Vilar, Artem Sokolov, Ann Clifton and Matt Post. 2017. Sockeye: A Toolkit for Neural Machine Translation. ArXiv e-prints.
Project details
Release history
Download files
Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
Source Distribution
Built Distribution
sockeye-1.18.97-py3-none-any.whl
Hashes for sockeye-1.18.97-py3-none-any.whl
Algorithm | Hash digest
---|---
SHA256 | 1c67706c9764d035bcf3fc1d378efda5b8c5cb55bc2f4a8b3cfbdb29f713d4ea
MD5 | 50c43959eb84a9054ff9dfea53fc3faf
BLAKE2b-256 | d77882353d73d80272d52e7fdd58cf7f1e1bee26c78b0213ba147832511447a1
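If you download the wheel manually, the digests above can be checked before installation. The following is a minimal sketch using Python's standard `hashlib`; it assumes the wheel sits in the current directory.

```python
# Minimal sketch: verify a manually downloaded wheel against the SHA256
# digest listed in the table above before installing it.
import hashlib

EXPECTED_SHA256 = "1c67706c9764d035bcf3fc1d378efda5b8c5cb55bc2f4a8b3cfbdb29f713d4ea"

def sha256_of(path: str) -> str:
    """Return the hex-encoded SHA256 digest of a file, read in 1 MiB chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as fh:
        for chunk in iter(lambda: fh.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

if __name__ == "__main__":
    actual = sha256_of("sockeye-1.18.97-py3-none-any.whl")
    print("OK" if actual == EXPECTED_SHA256 else "MISMATCH")
```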