
Sockeye
=======

|Documentation Status| |Build Status|

This package contains the Sockeye project, a sequence-to-sequence
framework for Neural Machine Translation based on Apache MXNet. It
implements the well-known encoder-decoder architecture with attention.

If you are interested in collaborating or have any questions, please
submit a pull request or issue. You can also send questions to
*sockeye-dev-at-amazon-dot-com*.

Dependencies
------------

Sockeye requires:

- **Python3**
- `MXNet-0.10.0 <https://github.com/dmlc/mxnet/tree/v0.10.0>`__
- numpy

Installation
------------

For AWS DeepLearning AMI users
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

AWS DeepLearning AMI users only need to run the following line to
install sockeye:

.. code:: bash

    > sudo pip3 install sockeye --no-deps

For other environments, you can choose between installing via pip or
directly from source.

pip
~~~

CPU
^^^

.. code:: bash

    > pip install sockeye

GPU
^^^

If you want to run Sockeye on a GPU, you need to make sure your version
of Apache MXNet contains the GPU code. Depending on your version of
CUDA, run the following for CUDA 8.0:

.. code:: bash

    > wget https://raw.githubusercontent.com/awslabs/sockeye/master/requirements.gpu-cu80.txt
    > pip install sockeye --no-deps -r requirements.gpu-cu80.txt
    > rm requirements.gpu-cu80.txt

or the following for CUDA 7.5:

.. code:: bash

    > wget https://raw.githubusercontent.com/awslabs/sockeye/master/requirements.gpu-cu75.txt
    > pip install sockeye --no-deps -r requirements.gpu-cu75.txt
    > rm requirements.gpu-cu75.txt

From Source
~~~~~~~~~~~

CPU
^^^

If you just want to use Sockeye without extending it, simply install it
via

.. code:: bash

    > python setup.py install

after cloning the repository from git.

GPU
^^^

If you want to run Sockeye on a GPU, you need to make sure your version
of Apache MXNet contains the GPU code. Depending on your version of
CUDA, run the following for CUDA 8.0:

.. code:: bash

    > pip install -r requirements.gpu-cu80.txt
    > python setup.py install

or the following for CUDA 7.5:

.. code:: bash

    > pip install -r requirements.gpu-cu75.txt
    > python setup.py install

Optional dependencies
~~~~~~~~~~~~~~~~~~~~~

In order to track learning curves during training you can optionally
install dmlc's tensorboard fork (``pip install tensorboard``). If you
want to create alignment plots you will need to install matplotlib
(``pip install matplotlib``).

In general you can install all optional dependencies from the Sockeye
source folder using:

.. code:: bash

    > pip install -e '.[optional]'

*AWS DeepLearning AMI users need to use the python3 command instead of
python.*

Running sockeye
~~~~~~~~~~~~~~~

After installation, command line tools such as *sockeye-train,
sockeye-translate, sockeye-average* and *sockeye-embeddings* are
available. Alternatively, if the sockeye directory is on your PYTHONPATH
you can run the modules directly. For example *sockeye-train* can also
be invoked as

.. code:: bash

    > python -m sockeye.train <args>

*AWS DeepLearning AMI users need to use the python3 command instead of
python.*

First Steps
-----------

Train
~~~~~

In order to train your first Neural Machine Translation model you will
need two sets of parallel files: one for training and one for
validation. The latter will be used for computing various metrics during
training. Each set should consist of two files: one with source
sentences and one with target sentences (translations). Both files
should have the same number of lines, each line containing a single
sentence. Each sentence should be a whitespace-delimited list of tokens.
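
The format requirements above are easy to check mechanically: a parallel
corpus is valid only if the source and target files have the same number
of lines. A minimal shell sketch, using hypothetical file names:

.. code:: bash

    # Create a tiny two-sentence parallel corpus (hypothetical data).
    printf 'ein haus\nein auto\n' > sentences.de
    printf 'a house\na car\n' > sentences.en
    # Count lines in each file and verify they match.
    src_lines=$(wc -l < sentences.de)
    tgt_lines=$(wc -l < sentences.en)
    [ "$src_lines" -eq "$tgt_lines" ] && echo "parallel files match"

If the counts differ, the sentence pairs are misaligned and the model
would train on wrong translations, so this is worth checking before
every training run.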

Say you want to train a German-to-English translation model; you would
then call Sockeye like this:

.. code:: bash

    > python -m sockeye.train --source sentences.de \
                              --target sentences.en \
                              --validation-source sentences.dev.de \
                              --validation-target sentences.dev.en \
                              --use-cpu \
                              --output <model_dir>

After training, the directory ``<model_dir>`` will contain all model
artifacts such as parameters and the model configuration.

Translate
~~~~~~~~~

Input data for translation should be in the same format as the training
data (tokenization, preprocessing scheme). You can translate as follows:

.. code:: bash

    > python -m sockeye.translate --models <model_dir> --use-cpu

This will take the best set of parameters found during training and then
translate strings from STDIN and write translations to STDOUT.
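
Because translation behaves as a plain STDIN-to-STDOUT filter, it
composes with ordinary shell redirection, e.g.
``python -m sockeye.translate --models <model_dir> --use-cpu < input.de > output.en``.
The pattern can be sketched with ``cat`` standing in for the translate
command (which would require a trained model):

.. code:: bash

    # One tokenized source sentence per line (hypothetical input).
    printf 'ein beispiel satz\n' > input.de
    # cat stands in here for: python -m sockeye.translate --models <model_dir> --use-cpu
    cat < input.de > output.en
    wc -l < output.en

Each input line produces exactly one output line, so the line counts of
input and output always match.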

For more detailed examples check out our user documentation.

.. |Documentation Status| image:: https://readthedocs.org/projects/sockeye/badge/?version=latest
:target: http://sockeye.readthedocs.io/en/latest/?badge=latest
.. |Build Status| image:: https://travis-ci.org/awslabs/sockeye.svg?branch=master
:target: https://travis-ci.org/awslabs/sockeye

