
TEA - Translation Engine Architect


A command-line tool to create translation engines.

Install

First install pipx, then run:

pipx install pangeamt-tea

Usage

Step 1: Create a new project

tea new --customer customer --src_lang es --tgt_lang en --flavor automotion --version 2

This command will create the project directory structure:

├── customer_es_en_automotion_2
│   ├── config.yml
│   └── data

Then enter the directory:

cd customer_es_en_automotion_2

Step 2: Configuration

Tokenizer

A tokenizer can be applied to the source and the target:

tea config tokenizer --src mecab  --tgt moses

To list all available tokenizers:

tea config tokenizer --help

If you do not want to use tokenizers, run:

tea config tokenizer -s none -t none
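
Tokenization splits raw text into the units the translation model will see. TEA delegates this to existing tokenizers such as Moses or MeCab; as a rough illustration of what a Moses-style tokenizer does (a simplified sketch, not TEA's actual implementation):

```python
import re

def toy_tokenize(text):
    """Very rough Moses-style tokenization: split punctuation off words.
    Real tokenizers handle many more cases (abbreviations, numbers,
    language-specific rules)."""
    # Put spaces around common punctuation characters.
    text = re.sub(r"([.,!?;:()\"])", r" \1 ", text)
    # Splitting on whitespace also collapses the extra spaces.
    return text.split()

print(toy_tokenize("Hola, bien."))
```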

Truecaser

tea config truecaser --src --tgt

If you do not want to use a truecaser, run:

tea config truecaser
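
A truecaser restores each word to its most likely casing, learned from how the word is most often written in the training data; this avoids treating "The" and "the" as different tokens. A minimal sketch of the idea (illustrative only, not TEA's implementation):

```python
from collections import Counter

def train_truecaser(sentences):
    """Learn the most frequent casing of each word, keyed by lowercase form."""
    counts = {}
    for sent in sentences:
        for tok in sent.split():
            counts.setdefault(tok.lower(), Counter())[tok] += 1
    return {k: c.most_common(1)[0][0] for k, c in counts.items()}

def truecase(sentence, model):
    """Replace each token with its most frequent training-data casing."""
    return " ".join(model.get(tok.lower(), tok) for tok in sentence.split())

model = train_truecaser(["the UN met", "the UN voted"])
print(truecase("The un met", model))  # -> "the UN met"
```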

BPE / SentencePiece

For joint BPE:

tea config bpe -j

For separate (not joint) BPE:

tea config bpe -s -t

To use SentencePiece:

tea config bpe --sentencepiece

Add --model_type TEXT (default: unigram) or --vocab_size INTEGER (default: 8000) to change the defaults.
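
BPE builds a subword vocabulary by repeatedly merging the most frequent adjacent symbol pair, so rare words decompose into known subwords. One merge step of the classic algorithm can be sketched as follows (a toy illustration, not the subword library TEA uses):

```python
from collections import Counter

def most_frequent_pair(words):
    """Count adjacent symbol pairs over {symbol-tuple: frequency} and
    return the most frequent one."""
    pairs = Counter()
    for symbols, freq in words.items():
        for a, b in zip(symbols, symbols[1:]):
            pairs[(a, b)] += freq
    return pairs.most_common(1)[0][0]

def merge_pair(words, pair):
    """Apply one merge: replace every occurrence of the pair by its join."""
    merged = {}
    for symbols, freq in words.items():
        out, i = [], 0
        while i < len(symbols):
            if i + 1 < len(symbols) and (symbols[i], symbols[i + 1]) == pair:
                out.append(symbols[i] + symbols[i + 1])
                i += 2
            else:
                out.append(symbols[i])
                i += 1
        merged[tuple(out)] = freq
    return merged

# Word frequencies, with each word split into characters.
vocab = {("l", "o", "w"): 5, ("l", "o", "w", "e", "r"): 2}
best = most_frequent_pair(vocab)       # a pair seen 7 times
vocab = merge_pair(vocab, ("l", "o"))  # "l"+"o" becomes one symbol "lo"
```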

Processors

tea config processors -s "{processors}"

where processors is a list of preprocessing and postprocessing steps.

To list all available processors:

tea config processors --list

To test the processors that will be applied, run this script from the main TEA project directory:

debug_normalizers.py <config_file> <src_test> <tgt_test>

where config_file is the YAML config file, and src_test and tgt_test are the source and target segments to test.
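
Conceptually, the configured processors form a pipeline: each one takes the text produced by the previous one. A minimal sketch of that chaining (function names here are hypothetical, not TEA processor names):

```python
def strip_control_chars(text):
    """Drop non-printable control characters."""
    return "".join(ch for ch in text if ch.isprintable() or ch in "\n\t")

def normalize_whitespace(text):
    """Collapse runs of whitespace into single spaces."""
    return " ".join(text.split())

def run_pipeline(text, processors):
    """Apply each processor in order, feeding each one's output to the next."""
    for proc in processors:
        text = proc(text)
    return text

print(run_pipeline("  hello \x00 world  ",
                   [strip_control_chars, normalize_whitespace]))
```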

Prepare

tea config prepare --shard_size 100000 --src_seq_length 400 --tgt_seq_length 400
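
These options cap segment lengths and split the corpus into shards for training. A rough sketch of their effect (assuming lengths are counted in tokens, which is an assumption, not something the docs state):

```python
def prepare(pairs, shard_size=100000, src_seq_length=400, tgt_seq_length=400):
    """Drop pairs whose source or target exceeds its length limit,
    then split the survivors into shards of at most shard_size pairs."""
    kept = [
        (s, t) for s, t in pairs
        if len(s.split()) <= src_seq_length and len(t.split()) <= tgt_seq_length
    ]
    return [kept[i:i + shard_size] for i in range(0, len(kept), shard_size)]

shards = prepare([("hola", "hello"), ("a b c", "x y z")], shard_size=1)
```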

Translation model

tea config translation-model -n onmt

Step 3: Add data

Copy some multilingual resources (.tmx, bilingual files, .af) into the 'data' directory.

Step 4: Run

Create the workflow:

tea workflow new

Clean the data by running the normalizers and validators:

tea workflow clean -n {clean_th} -d

where clean_th is the number of threads.

Preprocess the data (split into train, dev, and test sets; tokenization; BPE):

tea workflow prepare -n {prepare_th} -s 3

where prepare_th is the number of threads.
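
The split into train, dev, and test sets works, conceptually, by shuffling the cleaned corpus and carving out held-out portions; a minimal illustration (set sizes here are hypothetical):

```python
import random

def split_corpus(pairs, dev_size=2000, test_size=2000, seed=42):
    """Shuffle deterministically, carve out dev and test sets,
    and use the remainder as the training set."""
    pairs = list(pairs)
    random.Random(seed).shuffle(pairs)
    dev = pairs[:dev_size]
    test = pairs[dev_size:dev_size + test_size]
    train = pairs[dev_size + test_size:]
    return train, dev, test

corpus = [(f"src {i}", f"tgt {i}") for i in range(100)]
train, dev, test = split_corpus(corpus, dev_size=10, test_size=10)
```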

Train the model

tea workflow train --gpu 0

To train on CPU, omit the --gpu parameter.

Evaluate model

tea workflow eval --step {step} --src file.src --ref file.tgt --log file.log --out file.out --gpu 0
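
Evaluation scores the model's output against a reference translation, typically with a metric such as BLEU. As a reminder of what such a metric computes, here is a simplified single-sentence BLEU with add-one smoothing (a sketch, not the exact scorer TEA uses):

```python
import math
from collections import Counter

def bleu(candidate, reference, max_n=4):
    """Geometric mean of smoothed 1..max_n-gram precisions,
    times a brevity penalty for short candidates."""
    cand, ref = candidate.split(), reference.split()
    log_prec = 0.0
    for n in range(1, max_n + 1):
        cand_ngrams = Counter(tuple(cand[i:i + n]) for i in range(len(cand) - n + 1))
        ref_ngrams = Counter(tuple(ref[i:i + n]) for i in range(len(ref) - n + 1))
        # Clipped overlap: each reference n-gram can be matched at most
        # as often as it occurs in the reference.
        overlap = sum((cand_ngrams & ref_ngrams).values())
        total = max(sum(cand_ngrams.values()), 1)
        # Add-one smoothing so one empty n-gram order doesn't zero the score.
        log_prec += math.log((overlap + 1) / (total + 1)) / max_n
    brevity = min(1.0, math.exp(1 - len(ref) / max(len(cand), 1)))
    return brevity * math.exp(log_prec)
```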
