Project description

LaMoTO: Language Modelling Tasks as Objects

Language Modelling Tasks as Objects (LaMoTO) provides a framework for language model training (masked and causal, pretraining and finetuning) where the tasks, not just the models, are classes themselves. It abstracts over the HuggingFace transformers.Trainer with one goal: reducing the entire model training process to a single method call, task.train(hyperparameters).

pip install lamoto[all]

Note: to use the W&B integration, make sure you first run wandb login in a command-line terminal on the machine you want to train on.

Usage

Let's say you want to train a RoBERTa-base model for dependency parsing (for which, by the way, there is no HuggingFace class). This is how you would do that in LaMoTO, supported by the magic of ArchIt:

from archit.instantiation.basemodels import RobertaBaseModel
from archit.instantiation.heads import DependencyParsingHeadConfig, BaseModelExtendedConfig
from lamoto.tasks import DP
from lamoto.training.auxiliary.hyperparameters import getDefaultHyperparameters

# Define task hyperparameters.
hp = getDefaultHyperparameters()
hp.model_config_or_checkpoint = "roberta-base"
hp.archit_basemodel_class = RobertaBaseModel
hp.archit_head_config = DependencyParsingHeadConfig(
    head_dropout=0.33,
    extended_model_config=BaseModelExtendedConfig(
        layer_pooling=1
    )
)

# Instantiate language modelling task as object, and train model.
task = DP()
task.train(hyperparameters=hp)

Features

  • Train models on >15 pre-training/fine-tuning tasks. See a list by importing from lamoto.tasks; a quick way to print that list is sketched after this feature list.
    • Model architectures come from ArchIt, which means that as long as you have a BaseModel wrapper for your language model backbone, you can train it on any task, regardless of whether you wrote code defining the backbone-with-head architecture required for that task.
    • Custom (i.e. given) architectures are also supported.
  • Evaluate models with a superset of the metrics in HuggingFace's evaluate, with custom inference procedures (see e.g. strided pseudo-perplexity or bits-per-character).
  • Augment datasets before training or evaluation by perturbing them.
  • Supports TkTkT tokenisers.
  • W&B integration.
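
As a quick illustration of the first feature: the usage example above imports DP from lamoto.tasks, so the available task classes appear to be exported at the package's top level. Under that assumption, here is a minimal sketch for printing them with standard introspection (this loop is not part of LaMoTO itself):

import inspect

import lamoto.tasks

# Print the name of every public class exported by lamoto.tasks.
# Assumption: task classes (like DP in the example above) are exported at
# the package's top level, as the usage example suggests.
for name, _cls in inspect.getmembers(lamoto.tasks, inspect.isclass):
    if not name.startswith("_"):
        print(name)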

Alternative packages

There exist other libraries that abstract across training tasks in an effort to avoid writing a dedicated training script per task. I'm aware of the following packages (although I'm not sure how extensible they are):

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

lamoto-2026.5.1.tar.gz (108.7 kB)

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

lamoto-2026.5.1-py3-none-any.whl (132.9 kB)

File details

Details for the file lamoto-2026.5.1.tar.gz.

File metadata

  • Download URL: lamoto-2026.5.1.tar.gz
  • Upload date:
  • Size: 108.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: Hatch/1.16.5 cpython/3.13.13 HTTPX/0.28.1

File hashes

Hashes for lamoto-2026.5.1.tar.gz
  • SHA256: cda2158a552c6206ca001a05abcdbe4a7d20bb7dfd42bbdc5b80f20eba043144
  • MD5: a001b7328c8a7d61b1b1b6d72cff5313
  • BLAKE2b-256: ec3c28370228e563a625bef1869eea76b68574a95c8fb24f5898823ee73784fd

See the PyPI documentation for more details on using hashes.
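
As a hedged sketch (standard library only; the file path is an assumption, so point it at wherever the archive was actually saved), this is one way to check the published SHA256 against a local download:

import hashlib

# Recompute the SHA256 digest of the downloaded sdist and compare it
# against the value published above.
EXPECTED_SHA256 = "cda2158a552c6206ca001a05abcdbe4a7d20bb7dfd42bbdc5b80f20eba043144"

with open("lamoto-2026.5.1.tar.gz", "rb") as f:
    actual = hashlib.sha256(f.read()).hexdigest()

assert actual == EXPECTED_SHA256, "Digest mismatch: this is not the published file."
print("SHA256 verified.")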

File details

Details for the file lamoto-2026.5.1-py3-none-any.whl.

File metadata

  • Download URL: lamoto-2026.5.1-py3-none-any.whl
  • Upload date:
  • Size: 132.9 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: Hatch/1.16.5 cpython/3.13.13 HTTPX/0.28.1

File hashes

Hashes for lamoto-2026.5.1-py3-none-any.whl
  • SHA256: af7d99e8e508bd250116de5667f186db487cce5c4894a43a293224e672d0c591
  • MD5: 5526e199d70bb5052fb347b5d0d606d4
  • BLAKE2b-256: 1c0ff399aaa145501343791df93402afd2d2b96fb11da690a5d69023b687e5c4

See the PyPI documentation for more details on using hashes.
