Omni Tool Runtime

Portable, cloud-agnostic execution runtime for OmniBioAI tools


Overview

omni-tool-runtime is a minimal, deterministic execution runtime used by OmniBioAI’s Tool Execution Service (TES) to run individual tools across multiple execution backends, including:

  • Local Docker execution
  • AWS Batch
  • Azure Batch
  • (future) Kubernetes Jobs
  • (future) TES-compatible HPC schedulers

The runtime provides a strict execution contract so that:

  • TES adapters stay thin and backend-specific
  • Tool containers remain portable and backend-agnostic
  • Results are uploaded consistently (S3 / Azure Blob / future backends)

This mirrors the design philosophy used throughout OmniBioAI: separate orchestration from execution, and execution from logic.


What This Runtime Is (and Is Not)

✅ This runtime is

  • A containerized tool launcher

  • Responsible for:

    • Reading tool inputs from environment variables
    • Executing tool logic
    • Writing results.json
    • Uploading results to cloud storage
  • Cloud-agnostic (AWS / Azure supported today)

❌ This runtime is not

  • A workflow engine
  • A scheduler
  • An LLM executor
  • A UI layer

Those responsibilities live elsewhere in OmniBioAI.


Execution Contract (Critical)

All tools executed via omni-tool-runtime must follow this contract.

Environment Variables (Injected by TES Adapter)

Variable         Description
---------------  -------------------------------------------------------
TOOL_ID          Tool identifier (echo_test, blastn, etc.)
RUN_ID           Unique run ID (generated by adapter)
INPUTS_JSON      JSON-encoded tool inputs
RESOURCES_JSON   JSON-encoded resource request
S3_RESULT_URI    (AWS Batch) S3 URI for uploading results
RESULT_URI       (Azure Batch) azureblob:// URI for uploading results

At most one of S3_RESULT_URI or RESULT_URI is set per run; when neither is present (e.g. local runs), no upload is attempted.
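Under this contract, a tool entrypoint can read its entire configuration from the environment. A minimal sketch of that read (the `read_contract` helper name is hypothetical, not part of the package):

```python
import json
import os

def read_contract(env=os.environ):
    """Read the execution-contract variables injected by a TES adapter."""
    # At most one of these is set per run; neither means "local, no upload".
    result_uri = env.get("S3_RESULT_URI") or env.get("RESULT_URI")
    return {
        "tool_id": env["TOOL_ID"],
        "run_id": env["RUN_ID"],
        "inputs": json.loads(env.get("INPUTS_JSON", "{}")),
        "resources": json.loads(env.get("RESOURCES_JSON", "{}")),
        "result_uri": result_uri,
    }
```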


Repository Structure

omni-tool-runtime/
├── Dockerfile
├── README.md
├── pyproject.toml
├── omni_tool_runtime/
│   ├── __init__.py
│   ├── result_uri.py          # URI parsing & dispatch
│   ├── upload_result.py       # Unified upload logic
│   └── uploaders/
│       ├── s3_uploader.py
│       └── azureblob_uploader.py
├── tools/
│   └── echo_test/
│       ├── __init__.py
│       └── run.py
└── tests/

Example Tool: echo_test

This is the reference implementation for all future tools.

Behavior

  • Reads INPUTS_JSON
  • Echoes a value
  • Writes results.json
  • Uploads results to configured storage backend

Minimal tool implementation

# tools/echo_test/run.py
import json
import os
from omni_tool_runtime.upload_result import upload_result

def main():
    tool_id = os.environ["TOOL_ID"]
    run_id = os.environ["RUN_ID"]
    inputs = json.loads(os.environ.get("INPUTS_JSON", "{}"))

    text = inputs.get("text", "")

    result = {
        "ok": True,
        "tool_id": tool_id,
        "run_id": run_id,
        "results": {"echo": text},
    }

    upload_result(result)

if __name__ == "__main__":
    main()

How Results Upload Works

upload_result() automatically detects the backend:

Backend   URI example
-------   -----------------------------------------------
AWS       s3://bucket/prefix/run_id/results.json
Azure     azureblob://account/container/path/results.json

The runtime:

  1. Serializes the result as JSON
  2. Uploads it to the correct backend
  3. Prints the result to stdout (for debugging)

Adapters never upload results themselves.
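The scheme-based dispatch described above can be sketched as follows. Function and uploader names here are illustrative, not the actual API of result_uri.py or the uploaders/ modules:

```python
import json
from urllib.parse import urlparse

def dispatch_upload(result, result_uri):
    """Route a serialized result to the backend implied by the URI scheme."""
    payload = json.dumps(result)
    print(payload)                 # result is always echoed to stdout for debugging
    if result_uri is None:
        return "local"             # local runs: no upload attempted
    scheme = urlparse(result_uri).scheme
    if scheme == "s3":
        # upload_to_s3(payload, result_uri)      # boto3-backed uploader (sketch)
        return "s3"
    if scheme == "azureblob":
        # upload_to_blob(payload, result_uri)    # azure-storage-blob uploader (sketch)
        return "azureblob"
    raise ValueError(f"Unsupported result URI scheme: {scheme}")
```

Keeping the dispatch in the runtime, not the adapters, is what lets adapters stay backend-specific but upload-free.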


Building the Docker Image

From repository root:

docker build -t man4ish/omni-tool-runtime:latest .

Verify:

docker images | grep omni-tool-runtime

Running a Tool Locally (No Cloud)

docker run --rm \
  -e TOOL_ID=echo_test \
  -e RUN_ID=local-test-1 \
  -e INPUTS_JSON='{"text":"hello world"}' \
  -e RESOURCES_JSON='{}' \
  man4ish/omni-tool-runtime:latest

Expected:

  • JSON output printed to stdout
  • No upload attempted if no result URI is provided

AWS Batch Usage

Job Definition

  • Image: man4ish/omni-tool-runtime:latest
  • Command override:
["python", "-m", "tools.echo_test.run"]

Injected Environment

  • S3_RESULT_URI provided by AwsBatchAdapter
  • IAM Role handles S3 auth
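An AwsBatchAdapter might assemble its boto3 `submit_job` call along these lines. The queue and job-definition names are deployment-specific placeholders, and this helper is a sketch, not the adapter's real code:

```python
def build_submit_job_args(tool_id, run_id, inputs_json, job_queue,
                          job_definition, s3_result_uri):
    """Build keyword arguments for boto3's batch.submit_job() (illustrative)."""
    return {
        "jobName": f"{tool_id}-{run_id}",
        "jobQueue": job_queue,
        "jobDefinition": job_definition,
        "containerOverrides": {
            "command": ["python", "-m", f"tools.{tool_id}.run"],
            "environment": [
                {"name": "TOOL_ID", "value": tool_id},
                {"name": "RUN_ID", "value": run_id},
                {"name": "INPUTS_JSON", "value": inputs_json},
                {"name": "RESOURCES_JSON", "value": "{}"},
                {"name": "S3_RESULT_URI", "value": s3_result_uri},
            ],
        },
    }

# Usage (sketch): boto3.client("batch").submit_job(**build_submit_job_args(...))
```

Note that no AWS credentials appear anywhere: the job's IAM role supplies S3 access at runtime.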

Azure Batch Usage

Task Settings

  • Image: man4ish/omni-tool-runtime:latest
  • Command:
python -m tools.echo_test.run

Injected Environment

  • RESULT_URI=azureblob://...
  • Managed Identity handles Blob auth
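The task settings an Azure Batch adapter injects can be sketched the same way. Plain dicts stand in for the azure-batch SDK model objects here; the field names mirror the execution contract, not the SDK:

```python
def build_task_settings(tool_id, run_id, inputs_json, image, result_uri):
    """Sketch of the Azure Batch task settings an adapter might assemble."""
    return {
        "id": f"{tool_id}-{run_id}",
        "command_line": f"python -m tools.{tool_id}.run",
        "container_settings": {"image_name": image},
        "environment_settings": [
            {"name": "TOOL_ID", "value": tool_id},
            {"name": "RUN_ID", "value": run_id},
            {"name": "INPUTS_JSON", "value": inputs_json},
            {"name": "RESOURCES_JSON", "value": "{}"},
            {"name": "RESULT_URI", "value": result_uri},
        ],
    }
```

As on AWS, no storage credentials are injected: the pool's Managed Identity authorizes the Blob upload.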

Pushing the Image

Docker Hub

docker push man4ish/omni-tool-runtime:latest

Azure Container Registry

az acr login --name YOUR_ACR
docker tag man4ish/omni-tool-runtime:latest YOUR_ACR.azurecr.io/omni-tool-runtime:latest
docker push YOUR_ACR.azurecr.io/omni-tool-runtime:latest

Adding a New Tool

Step 1: Create tool folder

mkdir tools/my_new_tool
touch tools/my_new_tool/__init__.py
touch tools/my_new_tool/run.py

Step 2: Implement run.py

Rules:

  • Must read env vars
  • Must write result via upload_result()
  • Must be deterministic
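A template run.py that follows these rules might look like the sketch below. The tool name and input keys are placeholders; separating the deterministic logic into its own function (here called `compute_results`, a hypothetical name) keeps it easy to unit-test without a cloud backend:

```python
# tools/my_new_tool/run.py -- minimal template (placeholder tool)
import json
import os

def compute_results(inputs):
    """Deterministic tool logic: same inputs must always yield the same results."""
    return {"processed": inputs.get("value")}

def main():
    # Imported here so the pure logic above stays testable in isolation.
    from omni_tool_runtime.upload_result import upload_result

    tool_id = os.environ["TOOL_ID"]
    run_id = os.environ["RUN_ID"]
    inputs = json.loads(os.environ.get("INPUTS_JSON", "{}"))

    upload_result({
        "ok": True,
        "tool_id": tool_id,
        "run_id": run_id,
        "results": compute_results(inputs),
    })

if __name__ == "__main__":
    main()
```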

Step 3: Register tool in adapter config

AWS Batch

job_definition_map:
  my_new_tool: "omnibioai-my-new-tool:1"

Azure Batch

tools:
  my_new_tool:
    image: "man4ish/omni-tool-runtime:latest"
    command: ["python", "-m", "tools.my_new_tool.run"]

Current State

Implemented

  • Unified runtime image
  • AWS Batch support
  • Azure Batch support
  • S3 + Azure Blob uploads
  • Deterministic execution contract
  • Reference echo_test tool

Intentionally Missing (by design)

  • No workflow orchestration
  • No retry logic
  • No state machine
  • No scheduling policy

Planned Future Enhancements

Short-term

  • Tool generator CLI (omnibioai tool new)
  • Structured logging
  • Result size validation
  • Runtime version pinning

Medium-term

  • Kubernetes Job adapter support
  • Streaming stdout to object storage
  • Tool-level resource enforcement
  • Tool metadata introspection

Long-term

  • Signed result manifests
  • Provenance hashing
  • Deterministic replay support
  • Cross-cloud artifact mirroring

Design Philosophy (Important)

This runtime is intentionally boring.

That’s a feature.

  • No magic
  • No backend assumptions
  • No hidden orchestration
  • One job → one tool → one result

Everything complex belongs above this layer.


Final Note

If this runtime feels similar to:

  • CWL CommandLineTool
  • TES task containers
  • AWS Batch single-purpose images

That’s intentional.

You’re building the correct abstraction boundary.
