# Omni Tool Runtime

Portable, cloud-agnostic execution runtime for OmniBioAI tools.

## Overview
`omnibioai-tool-runtime` is a minimal, deterministic execution runtime used by OmniBioAI's Tool Execution Service (TES) to run individual tools across multiple execution backends, including:
- Local Docker execution
- AWS Batch
- Azure Batch
- (future) Kubernetes Jobs
- (future) TES-compatible HPC schedulers
The runtime provides a strict execution contract so that:
- TES adapters stay thin and backend-specific
- Tool containers remain portable and backend-agnostic
- Results are uploaded consistently (S3 / Azure Blob / future backends)
This mirrors the design philosophy used throughout OmniBioAI: separate orchestration from execution, and execution from logic.
## What This Runtime Is (and Is Not)

### ✅ This runtime is

- A containerized tool launcher
- Responsible for:
  - Reading tool inputs from environment variables
  - Executing tool logic
  - Writing `results.json`
  - Uploading results to cloud storage
- Cloud-agnostic (AWS / Azure supported today)
### ❌ This runtime is not
- A workflow engine
- A scheduler
- An LLM executor
- A UI layer
Those responsibilities live elsewhere in OmniBioAI.
## Execution Contract (Critical)

All tools executed via `omnibioai-tool-runtime` must follow this contract.

### Environment Variables (Injected by TES Adapter)
| Variable | Description |
|---|---|
| `TOOL_ID` | Tool identifier (`echo_test`, `blastn`, etc.) |
| `RUN_ID` | Unique run ID (generated by the adapter) |
| `INPUTS_JSON` | JSON-encoded tool inputs |
| `RESOURCES_JSON` | JSON-encoded resource request |
| `S3_RESULT_URI` | (AWS Batch) S3 URI to upload results to |
| `RESULT_URI` | (Azure Batch) `azureblob://` URI to upload results to |

Only one of `S3_RESULT_URI` or `RESULT_URI` is expected per run.
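For reference, reading this contract requires nothing beyond the standard library. A minimal sketch (the helper name `read_contract` is illustrative, not part of the package API):

```python
import json
import os


def read_contract() -> dict:
    """Read the TES execution contract from environment variables."""
    return {
        "tool_id": os.environ["TOOL_ID"],   # required
        "run_id": os.environ["RUN_ID"],     # required
        "inputs": json.loads(os.environ.get("INPUTS_JSON", "{}")),
        "resources": json.loads(os.environ.get("RESOURCES_JSON", "{}")),
        # At most one of these is set; both may be absent for local runs.
        "result_uri": os.environ.get("S3_RESULT_URI") or os.environ.get("RESULT_URI"),
    }
```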
## Repository Structure

```text
omnibioai-tool-runtime/
├── Dockerfile
├── README.md
├── pyproject.toml
├── omni_tool_runtime/
│   ├── __init__.py
│   ├── result_uri.py          # URI parsing & dispatch
│   ├── upload_result.py       # Unified upload logic
│   └── uploaders/
│       ├── s3_uploader.py
│       └── azureblob_uploader.py
├── tools/
│   └── echo_test/
│       ├── __init__.py
│       └── run.py
└── tests/
```
## Example Tool: `echo_test`

This is the reference implementation for all future tools.

### Behavior

- Reads `INPUTS_JSON`
- Echoes a value
- Writes `results.json`
- Uploads results to the configured storage backend

### Minimal tool implementation
```python
# tools/echo_test/run.py
import json
import os

from omni_tool_runtime.upload_result import upload_result


def main():
    tool_id = os.environ["TOOL_ID"]
    run_id = os.environ["RUN_ID"]
    inputs = json.loads(os.environ.get("INPUTS_JSON", "{}"))
    text = inputs.get("text", "")

    result = {
        "ok": True,
        "tool_id": tool_id,
        "run_id": run_id,
        "results": {"echo": text},
    }
    upload_result(result)


if __name__ == "__main__":
    main()
```
## How Results Upload Works

`upload_result()` automatically detects the backend:
| Backend | URI Example |
|---|---|
| AWS | s3://bucket/prefix/run_id/results.json |
| Azure | azureblob://account/container/path/results.json |
The runtime:
- Serializes result as JSON
- Uploads to correct backend
- Prints result to stdout (for debugging)
Adapters never upload results themselves.
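The scheme-based dispatch can be sketched as follows. This is a simplified illustration, not the actual `omni_tool_runtime` internals; in the real runtime the registered uploaders would be boto3- and azure-storage-blob-backed callables:

```python
import json
from urllib.parse import urlparse

# Uploader registry keyed by URI scheme; backends register callables taking
# (location, key, payload). Real entries would wrap boto3 / azure-storage-blob.
UPLOADERS = {}


def parse_result_uri(uri):
    """Split a result URI into (scheme, bucket-or-account, object key)."""
    parsed = urlparse(uri)
    if parsed.scheme not in ("s3", "azureblob"):
        raise ValueError(f"unsupported result URI scheme: {parsed.scheme!r}")
    return parsed.scheme, parsed.netloc, parsed.path.lstrip("/")


def upload_result(result, uri=None):
    payload = json.dumps(result, sort_keys=True)
    print(payload)                  # always echoed to stdout for debugging
    if uri is None:
        return None                 # local run: no upload attempted
    scheme, location, key = parse_result_uri(uri)
    UPLOADERS[scheme](location, key, payload)
    return scheme
```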
## Building the Docker Image

From the repository root:

```bash
docker build -t man4ish/omnibioai-tool-runtime:latest .
```

Verify:

```bash
docker images | grep omnibioai-tool-runtime
```
## Running a Tool Locally (No Cloud)

```bash
docker run --rm \
  -e TOOL_ID=echo_test \
  -e RUN_ID=local-test-1 \
  -e INPUTS_JSON='{"text":"hello world"}' \
  -e RESOURCES_JSON='{}' \
  man4ish/omnibioai-tool-runtime:latest
```
Expected:
- JSON output printed to stdout
- No upload attempted if no result URI is provided
## AWS Batch Usage

### Job Definition

- Image: `man4ish/omnibioai-tool-runtime:latest`
- Command override: `["python", "-m", "tools.echo_test.run"]`

### Injected Environment

- `S3_RESULT_URI` provided by `AwsBatchAdapter`
- IAM role handles S3 auth
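On AWS Batch, the S3 leg of the upload reduces to a single `put_object` call authorized by the job's IAM role. A hedged sketch (helper names are illustrative, not the package's actual `s3_uploader` API):

```python
def split_s3_uri(uri):
    """'s3://bucket/prefix/key' -> ('bucket', 'prefix/key')."""
    if not uri.startswith("s3://"):
        raise ValueError(f"not an S3 URI: {uri!r}")
    bucket, _, key = uri[len("s3://"):].partition("/")
    return bucket, key


def upload_to_s3(uri, payload: str):
    import boto3  # credentials are resolved from the Batch job's IAM role

    bucket, key = split_s3_uri(uri)
    boto3.client("s3").put_object(
        Bucket=bucket,
        Key=key,
        Body=payload.encode("utf-8"),
        ContentType="application/json",
    )
```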
## Azure Batch Usage

### Task Settings

- Image: `man4ish/omnibioai-tool-runtime:latest`
- Command: `python -m tools.echo_test.run`

### Injected Environment

- `RESULT_URI=azureblob://...`
- Managed Identity handles Blob auth
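The Azure leg can rely on `DefaultAzureCredential`, which picks up the pool's Managed Identity automatically. A sketch assuming the `azureblob://account/container/path` layout shown earlier (helper names are illustrative, not the package's actual `azureblob_uploader` API):

```python
def split_azureblob_uri(uri):
    """'azureblob://account/container/path' -> (account, container, blob path)."""
    prefix = "azureblob://"
    if not uri.startswith(prefix):
        raise ValueError(f"not an azureblob URI: {uri!r}")
    account, _, rest = uri[len(prefix):].partition("/")
    container, _, blob_path = rest.partition("/")
    return account, container, blob_path


def upload_to_blob(uri, payload: str):
    # Deferred imports: only needed on the Azure path.
    from azure.identity import DefaultAzureCredential
    from azure.storage.blob import BlobClient

    account, container, blob_path = split_azureblob_uri(uri)
    blob = BlobClient(
        account_url=f"https://{account}.blob.core.windows.net",
        container_name=container,
        blob_name=blob_path,
        credential=DefaultAzureCredential(),  # Managed Identity on Azure Batch
    )
    blob.upload_blob(payload.encode("utf-8"), overwrite=True)
```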
## Pushing the Image

### Docker Hub

```bash
docker push man4ish/omnibioai-tool-runtime:latest
```

### Azure Container Registry

```bash
az acr login --name YOUR_ACR
docker tag man4ish/omnibioai-tool-runtime:latest YOUR_ACR.azurecr.io/omnibioai-tool-runtime:latest
docker push YOUR_ACR.azurecr.io/omnibioai-tool-runtime:latest
```
## Adding a New Tool

### Step 1: Create the tool folder

```bash
mkdir tools/my_new_tool
touch tools/my_new_tool/__init__.py
touch tools/my_new_tool/run.py
```

### Step 2: Implement `run.py`

Rules:

- Must read env vars
- Must write its result via `upload_result()`
- Must be deterministic
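A skeleton for such a `run.py` might look like the following. The tool and its logic are hypothetical, and `upload_result` is replaced here by a plain `print` so the snippet stands alone; a real tool would import it from `omni_tool_runtime.upload_result` as `echo_test` does:

```python
# tools/my_new_tool/run.py -- hypothetical example
import json
import os


def compute(inputs: dict) -> dict:
    """Deterministic tool logic: identical inputs must yield identical outputs."""
    return {"value_doubled": inputs.get("value", 0) * 2}


def main():
    result = {
        "ok": True,
        "tool_id": os.environ["TOOL_ID"],
        "run_id": os.environ["RUN_ID"],
        "results": compute(json.loads(os.environ.get("INPUTS_JSON", "{}"))),
    }
    # Real tools call omni_tool_runtime.upload_result.upload_result(result) here.
    print(json.dumps(result))


if __name__ == "__main__":
    main()
```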
### Step 3: Register the tool in adapter config

AWS Batch:

```yaml
job_definition_map:
  my_new_tool: "omnibioai-my-new-tool:1"
```

Azure Batch:

```yaml
tools:
  my_new_tool:
    image: "man4ish/omnibioai-tool-runtime:latest"
    command: ["python", "-m", "tools.my_new_tool.run"]
```
## Current State

### Implemented

- Unified runtime image
- AWS Batch support
- Azure Batch support
- S3 + Azure Blob uploads
- Deterministic execution contract
- Reference `echo_test` tool
### Intentionally Missing (by design)
- No workflow orchestration
- No retry logic
- No state machine
- No scheduling policy
## Planned Future Enhancements

### Short-term

- Tool generator CLI (`omnibioai tool new`)
- Structured logging
- Result size validation
- Runtime version pinning
### Medium-term
- Kubernetes Job adapter support
- Streaming stdout to object storage
- Tool-level resource enforcement
- Tool metadata introspection
### Long-term
- Signed result manifests
- Provenance hashing
- Deterministic replay support
- Cross-cloud artifact mirroring
## Design Philosophy (Important)
This runtime is intentionally boring.
That’s a feature.
- No magic
- No backend assumptions
- No hidden orchestration
- One job → one tool → one result
Everything complex belongs above this layer.
## Final Note
If this runtime feels similar to:
- CWL CommandLineTool
- TES task containers
- AWS Batch single-purpose images
That’s intentional.
You’re building the correct abstraction boundary.