
A stateless runner / deployment system for MESA models

Project description

MESA Runner

A stateless runner for deploying MESA-registered models onto GSTT infrastructure. It syncs models from S3, reads unprocessed documents from Snowflake, runs inference, and writes the results back.
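
The single pass described above can be sketched roughly as follows. Every name in this sketch is a hypothetical stand-in, not the package's actual API:

```python
# Illustrative sketch of one stateless pass: sync the model, read
# unprocessed documents, run inference, write results. All names here
# are hypothetical, not mesa_runner's real internals.

def sync_model(s3_uri: str) -> str:
    # The real runner would download model weights from S3 here.
    return f"model-from-{s3_uri}"

def fetch_unprocessed(storage: dict) -> list[str]:
    # The real runner would query the Snowflake source table here.
    return storage.get("pending", [])

def run_once(config: dict) -> list[str]:
    model = sync_model(config["model_s3_uri"])
    docs = fetch_unprocessed(config["storage"])
    # Inference is stubbed as tagging each document with the model name;
    # the real runner would call the configured inference backend and
    # then write results to the sink table.
    return [f"{model}:{doc}" for doc in docs]

results = run_once({"model_s3_uri": "s3://bucket/model",
                    "storage": {"pending": ["doc1", "doc2"]}})
print(results)
```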

Requirements

  • Python 3.13+
  • uv package manager

Installation

Remote Inference (Default)

For remote inference via OpenAI-compatible endpoints:

uv sync

Offline Inference (Optional)

For local GPU inference with vLLM, install the optional dependency:

uv sync --group vllm-offline

Configuration

Create a config.yaml file (see the examples below):

Remote Inference Example

my_source:
  model_name: "your-model-name"

  inference:
    openai_endpoint: "http://localhost:5000/v1"

  storage:
    type: snowflake
    source_database: "str"
    source_schema: "str"
    source_table: "str"

    sink_database: "str"
    sink_schema: "str"
    sink_table: "str"

    connection_params:
      account: "str"
      user: "str"
      role: "str"
      password: "str"
      warehouse: "str"
      database: "str"
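
The openai_endpoint above can point at any OpenAI-compatible server (for example, a local vLLM instance). As a rough sketch, a chat-completions request against it would carry a payload like the following; the model name and message are illustrative, and no request is actually sent here:

```python
import json

# Hypothetical payload for the OpenAI-compatible endpoint configured above.
endpoint = "http://localhost:5000/v1/chat/completions"
payload = {
    "model": "your-model-name",  # matches model_name in config.yaml
    "messages": [
        {"role": "user", "content": "Extract findings from this document."}
    ],
}
body = json.dumps(payload)  # this is what an HTTP client would POST to the endpoint
```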

Offline Inference Example

my_source:
  model_s3_uri: "s3://aicentre-nlpteam-mesa-build/models/oncoqwen/oncoqwen_1/"

  inference:
    max_model_len: 18000

  storage:
    type: snowflake
    source_database: "str"
    source_schema: "str"
    source_table: "str"

    sink_database: "str"
    sink_schema: "str"
    sink_table: "str"

    connection_params:
      account: "str"
      user: "str"
      role: "str"
      password: "str"
      warehouse: "str"
      database: "str"
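
Either config can be loaded with standard YAML tooling. A minimal sketch using PyYAML (an assumption for illustration; the runner's own loader may differ), with an inlined fragment of the remote example:

```python
import yaml  # PyYAML; assumed here purely for illustration

CONFIG = """
my_source:
  model_name: "your-model-name"
  inference:
    openai_endpoint: "http://localhost:5000/v1"
  storage:
    type: snowflake
"""

cfg = yaml.safe_load(CONFIG)
source = cfg["my_source"]
print(source["inference"]["openai_endpoint"])  # endpoint the runner would call
```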

Usage

# Run with default config.yaml
mesa_runner

# Or specify a config file
mesa_runner --config /path/to/config.yaml

# Dry run mode (uses dummy data, does not read or write real data)
mesa_runner --dry-run

Dry-run mode is useful for testing the runner without touching real data sources or sinks. By default it generates 5 dummy documents and logs all write operations instead of executing them.
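
As an illustration of the dry-run behaviour (names hypothetical, not the runner's actual internals):

```python
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("mesa_runner.dry_run")

def generate_dummy_documents(n: int = 5) -> list[dict]:
    # Dry-run mode generates 5 dummy documents by default.
    return [{"id": i, "text": f"dummy document {i}"} for i in range(n)]

def write_results(rows: list[dict], dry_run: bool) -> None:
    if dry_run:
        # In dry-run mode, writes are logged rather than executed.
        log.info("DRY RUN: would write %d rows to the sink table", len(rows))
        return
    raise NotImplementedError("real Snowflake write path not shown in this sketch")

write_results(generate_dummy_documents(), dry_run=True)
```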

Docker

# Remote inference (default)
docker build -t mesa-runner .
docker run mesa-runner

# Offline inference (includes vLLM)
docker build --target offline -t mesa-runner:offline .
docker run --gpus all mesa-runner:offline

Development

# Run linting and tests
make test

# Auto-fix linting issues
make fix

# Run tests with coverage
make cov



Download files

Download the file for your platform.

Source Distribution

londonaicentre_mesa_runner-1.2.3.tar.gz (30.8 kB)


Built Distribution


londonaicentre_mesa_runner-1.2.3-py3-none-any.whl (25.5 kB)


File details

Details for the file londonaicentre_mesa_runner-1.2.3.tar.gz.

File metadata

  • Download URL: londonaicentre_mesa_runner-1.2.3.tar.gz
  • Size: 30.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: uv/0.11.2 {"installer":{"name":"uv","version":"0.11.2","subcommand":["publish"]},"python":null,"implementation":{"name":null,"version":null},"distro":{"name":"Amazon Linux","version":"2023","id":null,"libc":null},"system":{"name":null,"release":null},"cpu":null,"openssl_version":null,"setuptools_version":null,"rustc_version":null,"ci":true}

File hashes

Hashes for londonaicentre_mesa_runner-1.2.3.tar.gz
Algorithm Hash digest
SHA256 4d6e592d60b6bf84deace1a6643765be1265d769093e5524edf389af5773e3d3
MD5 71e2561814195c8d1d6555fe4baae083
BLAKE2b-256 af508fddf44afe0ba48d0cb9539829cc1201d5d940c1265b92bf0b0668b90565

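
To check a downloaded sdist against the SHA256 digest above, something like the following works; the digest is copied from this page, and the file path is an example:

```python
import hashlib

def sha256_of(path: str) -> str:
    # Stream the file in chunks so large archives are not loaded into memory.
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

EXPECTED = "4d6e592d60b6bf84deace1a6643765be1265d769093e5524edf389af5773e3d3"
# digest = sha256_of("londonaicentre_mesa_runner-1.2.3.tar.gz")
# if digest != EXPECTED, the file is corrupted or has been tampered with
```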

File details

Details for the file londonaicentre_mesa_runner-1.2.3-py3-none-any.whl.

File metadata

  • Download URL: londonaicentre_mesa_runner-1.2.3-py3-none-any.whl
  • Size: 25.5 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: uv/0.11.2 {"installer":{"name":"uv","version":"0.11.2","subcommand":["publish"]},"python":null,"implementation":{"name":null,"version":null},"distro":{"name":"Amazon Linux","version":"2023","id":null,"libc":null},"system":{"name":null,"release":null},"cpu":null,"openssl_version":null,"setuptools_version":null,"rustc_version":null,"ci":true}

File hashes

Hashes for londonaicentre_mesa_runner-1.2.3-py3-none-any.whl
Algorithm Hash digest
SHA256 0553ad714a0a93c880a282bff832400fffcfb7846134bb638cf1864a82ff1cc3
MD5 14233a93095aadf9f81c8eeedd217019
BLAKE2b-256 d1a4c1adbfcf3a28937745820f5e2e911c175ef644322a3e27b88ddac62f01f9

