Sequence-to-Sequence framework for Neural Machine Translation
Project description
Sockeye
This package contains the Sockeye project, a sequence-to-sequence framework for Neural Machine Translation based on Apache MXNet (Incubating). It implements state-of-the-art encoder-decoder architectures, such as:
- Deep Recurrent Neural Networks with Attention [Bahdanau, '14]
- Transformer Models with self-attention [Vaswani et al., '17]
- Fully convolutional sequence-to-sequence models [Gehring et al., '17]
In addition, it provides an experimental image-to-description module that can be used for image captioning. Recent developments and changes are tracked in our CHANGELOG.
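As an illustration of how these models are typically trained and used, here is a minimal command-line sketch in the spirit of the Sockeye documentation. The file names (train.de, train.en, dev.de, dev.en, model_dir) are placeholders, and the exact flags should be checked against `python -m sockeye.train --help` for the installed version:

```
# Train a transformer encoder-decoder model on parallel data
# (file names are placeholders; see --help for the full set of options).
python -m sockeye.train --source train.de \
                        --target train.en \
                        --validation-source dev.de \
                        --validation-target dev.en \
                        --encoder transformer \
                        --decoder transformer \
                        --output model_dir

# Translate new input with the trained model.
python -m sockeye.translate --models model_dir < test.de > test.en
```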
If you have any questions or discover problems, please file an issue. You can also send questions to sockeye-dev-at-amazon-dot-com.
Documentation
For information on how to use Sockeye, please visit our documentation. Developers may be interested in our developer guidelines.
Citation
For technical information about Sockeye, see our paper on the arXiv (BibTeX):
Felix Hieber, Tobias Domhan, Michael Denkowski, David Vilar, Artem Sokolov, Ann Clifton and Matt Post. 2017. Sockeye: A Toolkit for Neural Machine Translation. ArXiv e-prints.
Project details
Release history
Download files
Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
Source Distribution
Built Distribution
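For most users, installing the built distribution from PyPI with pip is the simplest route. A minimal sketch, assuming a Python 3 environment (the version pin matches the wheel listed below):

```
# Install this release from PyPI.
pip install sockeye==1.18.78
```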
Hashes for sockeye-1.18.78-py3-none-any.whl
Algorithm | Hash digest
---|---
SHA256 | bcec400e145fd58c4e265b96371bcbc70fe3287d3b1cf1fe7bdbe54685819b82
MD5 | fd522d1d4c07e308b98f0030a8409a54
BLAKE2b-256 | 0678fbc4c997535d7f4d231b4169bbf550725b50f21cfd401cd8d74e6e15a11e
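If the wheel is downloaded manually, its integrity can be checked against the digests above before installing. A minimal sketch using standard command-line tools:

```
# Compare the computed digest with the SHA256 value listed above.
sha256sum sockeye-1.18.78-py3-none-any.whl
# Expected:
# bcec400e145fd58c4e265b96371bcbc70fe3287d3b1cf1fe7bdbe54685819b82

# Install the verified wheel directly.
pip install sockeye-1.18.78-py3-none-any.whl
```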