Morphoformer with CELMoE-based multilingual morphology, typed training pipeline, and publishable CLI.

morphoformer

morphoformer is the application package of the Morph_v4 stack. It combines character-level vocabularies, dataset tooling, typed training utilities, reusable Transformer blocks, and the generic CELMoE hierarchy into a trainable multilingual morphology system.

PyPI package name:

pip install morphoformer

Import name:

import morphoformer

What this package is

Unlike the libraries under libs/, morphoformer is not just a toolkit piece. It is the runnable application layer:

  • configuration loading
  • CLI commands
  • model wiring
  • trainer
  • inference entry points

It depends on these independently publishable packages:

  • chartoken-vp
  • celmoe-vp
  • sigmorphon-vp
  • torchblocks-vp
  • trainkit-vp

Architecture summary

The current model builds a three-level expert hierarchy:

  • universal
  • family
  • language

Orchestration across these levels is handled by HierarchicalCELMoE from celmoe-vp; morphoformer supplies the morphology-specific expert blocks, embeddings, routing, and output heads.

Input side:

  • character embeddings
  • feature embeddings
  • language embeddings
  • feature-to-token broadcast fusion (sketched below)
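
A minimal sketch of the fusion step, assuming mean-pooled feature embeddings projected to d_model and added at every character position; the real fusion module lives in morphoformer and may differ:

import torch
import torch.nn as nn

# Illustrative shapes matching the config example below:
# d_model=768, feature_dim=128, max_features=12.
char_emb = torch.randn(2, 20, 768)   # (batch, chars, d_model)
feat_emb = torch.randn(2, 12, 128)   # (batch, max_features, feature_dim)
proj = nn.Linear(128, 768)

# Pool the feature set, project to d_model, broadcast over all positions.
fused = char_emb + proj(feat_emb.mean(dim=1, keepdim=True))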

Expert side:

  • MorphExpertStack built from torchblocks-vp
  • configurable attention, norm, feedforward, adapter, convolution, and position modules
  • routing by language family and language code

Output side:

  • logits
  • universal_logits
  • family_logits
  • language_logits

Those outputs are consumed by the multi-loss training setup in trainkit-vp.
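
In sketch form, the contract those heads satisfy looks like this; the dict keys match the list above, while the shapes and values are purely illustrative stand-ins:

import torch

# Stand-in for one forward pass: each head yields logits over the
# character vocabulary, shaped (batch, target_len, vocab_size).
batch, tgt_len, vocab = 2, 20, 120
outputs = {
    "logits": torch.randn(batch, tgt_len, vocab),
    "universal_logits": torch.randn(batch, tgt_len, vocab),
    "family_logits": torch.randn(batch, tgt_len, vocab),
    "language_logits": torch.randn(batch, tgt_len, vocab),
}

The loss sketch under "Training flow" consumes a dict of exactly this shape.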

Installation

Requirements:

  • Python >=3.14
  • PyTorch >=2.0

Install from PyPI:

pip install morphoformer

For local development from this repository, publish or install the dependent libraries first, because they are versioned as separate packages.

CLI

The package exposes the morphoformer console command.

Available subcommands:

  • download
  • inspect-config
  • train
  • infer

Download data

List languages:

morphoformer download --list-languages

Download specific languages and merge them:

morphoformer download --lang rus,krl,afb --out-dir data --merge

Download everything known by the downloader:

morphoformer download --lang all --out-dir data

Inspect config

morphoformer inspect-config --config dev/config.toml

Train

morphoformer train --config dev/config.toml

The trainer writes the best checkpoint into the configured output directory.

Infer

morphoformer infer `
  --config dev/config.toml `
  --checkpoint artifacts/v4_omni/best.pt `
  --lemma write `
  --tags "V;PST" `
  --lang eng

Configuration

The TOML config is loaded into typed dataclasses (loading sketched after this list):

  • DataConfig
  • LanguageConfig
  • ModelConfig
  • OptimizerConfig
  • TrainConfig
  • DecodeConfig
  • MorphoformerConfig
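
A minimal sketch of the loading pattern, using the stdlib tomllib and a simplified DataConfig; the real dataclasses carry more fields than shown here:

import tomllib
from dataclasses import dataclass

@dataclass
class DataConfig:
    train_path: str
    dev_path: str
    max_len: int = 128
    max_features: int = 16

# Parse the TOML file and bind the [data] table to the dataclass.
with open("dev/config.toml", "rb") as f:
    raw = tomllib.load(f)

data_cfg = DataConfig(**raw["data"])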

Main config sections:

  • [data]
  • [model]
  • [optimizer]
  • [train]
  • [decode]
  • [languages.<code>]

Example:

[data]
train_path = "data/merged_train.tsv"
dev_path = "data/merged_dev.tsv"
max_len = 96
max_features = 12

[model]
d_model = 768
dim_ff = 2304
num_heads = 12
num_kv_heads = 4
dropout = 0.12
max_positions = 256
feature_dim = 128
attention = "gqa"
feedforward = "swiglu"
norm = "rmsnorm"
adapter = "language_conditioned"
universal_layers = 8
family_layers = 2
language_layers = 2

[train]
stage = "joint"
epochs = 10
batch_size = 64
warmup_steps = 500
total_steps = 12000
output_dir = "artifacts/v4_omni"

[languages.rus]
family = "slavic"

Training flow

The trainer does the following:

  1. load train and dev TSV data
  2. build character and feature vocabularies
  3. build the language-to-id map from config
  4. pre-encode datasets into MorphDataset
  5. instantiate Morphoformer
  6. freeze or unfreeze stages according to train.stage
  7. optimize with AdamW, a warmup cosine schedule, and AMP when enabled (schedule sketched after this list)
  8. evaluate on the dev set each epoch
  9. save the best checkpoint
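
The schedule from step 7, in sketch form; the actual scheduler lives in trainkit-vp, and the step counts here mirror the [train] example above:

import math

def lr_scale(step: int, warmup_steps: int = 500, total_steps: int = 12000) -> float:
    """Multiplier on the base learning rate: linear warmup, then cosine decay."""
    if step < warmup_steps:
        return step / max(1, warmup_steps)
    progress = (step - warmup_steps) / max(1, total_steps - warmup_steps)
    return 0.5 * (1.0 + math.cos(math.pi * min(1.0, progress)))

# usage: torch.optim.lr_scheduler.LambdaLR(optimizer, lr_scale)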

The loss is a weighted combination of the following terms (sketched below):

  • final output loss
  • universal expert loss
  • family expert loss
  • language expert loss
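
A minimal sketch of that combination, assuming cross-entropy per head; the weight values are illustrative, and the real weighting is configured through trainkit-vp:

import torch
import torch.nn.functional as F

HEADS = ("logits", "universal_logits", "family_logits", "language_logits")

def combined_loss(outputs: dict, targets: torch.Tensor,
                  weights=(1.0, 0.3, 0.3, 0.3)) -> torch.Tensor:
    # Flatten (batch, tgt_len, vocab) logits against (batch, tgt_len) targets.
    losses = [F.cross_entropy(outputs[k].flatten(0, 1), targets.flatten())
              for k in HEADS]
    return sum(w * l for w, l in zip(weights, losses))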

Checkpoint contents

Saved checkpoints include:

  • model_state
  • optimizer_state
  • char_vocab
  • feature_vocab
  • language_to_id
  • epoch

That is enough to restore the model together with the exact vocabularies used during training.
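
A minimal restore sketch using the keys above; rebuilding the Morphoformer instance needs the same ModelConfig, so those lines are left indicative:

import torch

# weights_only=False because the checkpoint pickles vocab objects, not just tensors.
ckpt = torch.load("artifacts/v4_omni/best.pt", map_location="cpu", weights_only=False)
char_vocab = ckpt["char_vocab"]
feature_vocab = ckpt["feature_vocab"]
language_to_id = ckpt["language_to_id"]
# model = Morphoformer(...)                  # constructor args are illustrative
# model.load_state_dict(ckpt["model_state"])
print(f"resuming from epoch {ckpt['epoch']}")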

Inference path

predict_form(...) does the following:

  • encodes the lemma with CharVocab
  • encodes tags with FeatureVocab
  • maps the language string to language_id
  • runs greedy decoding through the model (sketched below)
  • decodes predicted ids back into a surface string
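
A minimal greedy loop under assumed interfaces; the model call signature and the special token ids are illustrative, and the real path lives inside predict_form:

import torch

@torch.no_grad()
def greedy_decode(model, src_ids, feat_ids, lang_id,
                  bos_id=1, eos_id=2, max_len=96):
    out = [bos_id]
    for _ in range(max_len):
        tgt = torch.tensor([out])
        # Call signature is illustrative; only the "logits" key is assumed
        # from the output contract described earlier.
        logits = model(src_ids, feat_ids, lang_id, tgt)["logits"]
        next_id = int(logits[0, -1].argmax())
        if next_id == eos_id:
            break
        out.append(next_id)
    return out[1:]  # predicted ids, ready for CharVocab decoding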

Relationship to celmoe-vp

This package is where the task-specific part begins.

celmoe-vp itself stays generic and knows nothing about morphology. morphoformer is responsible for:

  • choosing hierarchy levels
  • defining expert block structure
  • mapping languages to families
  • attaching morphology-specific heads
  • converting expert outputs into token logits

That split is important because the architecture package and the application package are published separately.

Publishing and versioning

In Morph_v4 the libraries are not bundled into one mega-package. Each one is published independently, and morphoformer depends on versioned releases of the lower-level libs.

That means before publishing morphoformer, you should publish compatible versions of:

  • chartoken-vp
  • celmoe-vp
  • sigmorphon-vp
  • torchblocks-vp
  • trainkit-vp

The repository includes publish.ps1 to build, version, and publish the stack in dependency order.
