OmniBioAI Tool Execution Service (TES): secure, validated execution of bioinformatics tools on local/HPC/remote backends
omnibioai-tool-exec is a standalone Tool Execution Service (TES) that enables secure, validated, and reproducible execution of bioinformatics tools across heterogeneous compute environments (local, HPC, cloud, or remote servers).
It is designed to be used by agentic AI systems (such as OmniBioAI) where an LLM interprets user intent, but never executes tools directly. Instead, all execution is delegated to TES via a strict API contract.
Why TES exists
LLMs are good at understanding intent, but bad at safely running tools.
TES solves this by acting as a control plane between AI agents and real compute:
- Validates tool inputs and resource requests
- Matches tools to compatible execution environments
- Executes tools in a controlled adapter layer
- Tracks run lifecycle (submitted → running → completed/failed)
- Returns structured, machine-readable results
This prevents:
- Arbitrary command execution
- Hallucinated tool outputs
- Tight coupling between UI, AI, and infrastructure
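The run lifecycle TES tracks (submitted → running → completed/failed) can be sketched as a small state machine. This is illustrative only; `RunState` and `advance` are hypothetical names, not part of the TES API:

```python
from enum import Enum

class RunState(Enum):
    SUBMITTED = "SUBMITTED"
    RUNNING = "RUNNING"
    COMPLETED = "COMPLETED"
    FAILED = "FAILED"

# Legal transitions: SUBMITTED -> RUNNING -> COMPLETED or FAILED.
# COMPLETED and FAILED are terminal.
TRANSITIONS = {
    RunState.SUBMITTED: {RunState.RUNNING},
    RunState.RUNNING: {RunState.COMPLETED, RunState.FAILED},
    RunState.COMPLETED: set(),
    RunState.FAILED: set(),
}

def advance(current: RunState, new: RunState) -> RunState:
    """Move a run to a new state, rejecting illegal jumps."""
    if new not in TRANSITIONS[current]:
        raise ValueError(f"illegal transition {current.value} -> {new.value}")
    return new
```

Rejecting illegal transitions (e.g. a run jumping straight from SUBMITTED to COMPLETED) is one concrete way a control plane prevents hallucinated outcomes from entering the system.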
Core Concepts
Tool
A tool is a declarative definition of:
- Inputs schema (JSON Schema)
- Outputs schema
- Requirements (e.g. BLAST database, reference genome)
Example:

```yaml
tool_id: blastn
inputs_schema:
  required: [sequence, database]
```
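A minimal sketch of how a declared `required` list could gate a run request. `validate_inputs` is a hypothetical helper for illustration; the real TES applies full JSON Schema validation:

```python
def validate_inputs(schema: dict, inputs: dict) -> list[str]:
    """Return a list of validation errors (empty list means valid).

    Only checks the 'required' keys from the example schema above;
    a full implementation would validate types and formats too.
    """
    missing = [k for k in schema.get("required", []) if k not in inputs]
    return [f"missing required input: {k}" for k in missing]

blastn_schema = {"required": ["sequence", "database"]}

# A request missing 'database' is rejected before anything executes.
errors = validate_inputs(blastn_schema, {"sequence": ">q\nACGTACGT"})
```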
Server
A server represents an execution environment:
- Local machine
- HPC cluster
- Kubernetes
- Remote API-backed service
Each server advertises:
- Which tools it supports
- Available resources
- Storage constraints
- Runtime policies
Adapter
An adapter translates a validated run request into a real execution:
- Local process
- Container
- Job scheduler
- HTTP call
Adapters are pluggable and isolated from the API layer.
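The pluggable-adapter idea can be sketched with a structural interface. `Adapter`, `DemoAdapter`, and `dispatch` are illustrative names, assuming a simple dict-based result contract:

```python
from typing import Protocol

class Adapter(Protocol):
    """Turns a validated run request into an actual execution."""
    def execute(self, tool_id: str, inputs: dict, resources: dict) -> dict: ...

class DemoAdapter:
    """Stands in for a real backend; returns canned results."""
    def execute(self, tool_id: str, inputs: dict, resources: dict) -> dict:
        return {"tool_id": tool_id, "state": "COMPLETED", "results": {"hits": []}}

# Registry keyed by adapter type; a real container or HPC adapter
# would slot in here without touching the API layer.
ADAPTERS: dict[str, Adapter] = {"local": DemoAdapter()}

def dispatch(adapter_type: str, tool_id: str, inputs: dict, resources: dict) -> dict:
    return ADAPTERS[adapter_type].execute(tool_id, inputs, resources)
```

Because the API layer only ever calls `execute`, swapping the demo adapter for a real one changes no API code.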
Architecture Overview

```
User / Chat UI
      |
      v
LLM / Agent (intent only)
      |
      v
omnibioai-tool-exec (TES)
  - validate
  - route
  - execute
      |
      v
Execution Environment
(local / HPC / cloud)
```
Key rule: LLMs never execute tools. TES does.
API Overview
TES exposes a REST API (OpenAPI / FastAPI-based).
Discovery
- `GET /api/tools` – list available tools
- `GET /api/servers` – list available servers and capabilities
Execution
- `POST /api/runs/validate` – validate tool + inputs + resources
- `POST /api/runs/submit` – submit a run
- `GET /api/runs/{run_id}` – get run status
- `GET /api/runs/{run_id}/results` – fetch results when ready
Example: BLAST run lifecycle
1. Validate
POST /api/runs/validate

```json
{
  "tool_id": "blastn",
  "inputs": {
    "sequence": ">q\nACGTACGT",
    "database": "ecoli_demo"
  },
  "resources": { "cpu": 2, "ram_gb": 2 }
}
```
2. Submit
POST /api/runs/submit
→ { "run_id": "run_abc123" }
3. Poll
GET /api/runs/run_abc123
→ state: RUNNING | COMPLETED | FAILED
4. Fetch results
GET /api/runs/run_abc123/results
→ structured JSON results
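The poll step above can be wrapped in a small helper. `wait_for_run` and `get_status` are illustrative names; in a real client, `get_status` would wrap an HTTP GET against `/api/runs/{run_id}` and read the `state` field:

```python
import time

def wait_for_run(get_status, run_id: str, poll_interval: float = 0.0, max_polls: int = 100) -> str:
    """Poll until the run reaches a terminal state, then return it.

    `get_status` is any callable mapping run_id -> state string,
    which keeps this helper testable without a live server.
    """
    for _ in range(max_polls):
        state = get_status(run_id)
        if state in ("COMPLETED", "FAILED"):
            return state
        time.sleep(poll_interval)
    raise TimeoutError(f"run {run_id} did not finish after {max_polls} polls")

# Simulated backend: the run completes on the third poll.
_states = iter(["RUNNING", "RUNNING", "COMPLETED"])
final = wait_for_run(lambda rid: next(_states), "run_abc123")
```

Once the state is `COMPLETED`, the client fetches `/api/runs/{run_id}/results` for the structured JSON output.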
Demo Adapter (Current)
The default configuration includes a local demo adapter that returns mock BLAST results.
This is intentional:
- It validates the entire execution pipeline
- It allows frontend and agent development without heavy dependencies
- It can be replaced with a real backend without changing the API
Different inputs currently return the same demo output. Replace the adapter to enable real BLAST execution.
Running the Service
Requirements
- Python ≥ 3.11
- `pip` or `pipx`
Install (editable)
pip install -e .
Run
```shell
omnibioai-tes serve \
  --host 127.0.0.1 \
  --port 8080 \
  --tools configs/tools.example.yaml \
  --servers configs/servers.example.yaml
```
API Docs
Open:
http://127.0.0.1:8080/docs
Configuration
Tools
Defined via YAML:
```yaml
- tool_id: blastn
  display_name: BLASTN
  inputs_schema:
    required: [sequence, database]
```
Servers
Defined via YAML:
```yaml
- server_id: local_demo
  adapter_type: local
  capabilities:
    tools:
      - tool_id: blastn
        features:
          databases: [ecoli_demo]
```
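Server capability matching (routing a validated request to a compatible server) can be sketched as below. `find_server` and the in-memory `SERVERS` list mirror the YAML example but are illustrative, not the actual TES routing code:

```python
# In-memory mirror of the servers YAML example.
SERVERS = [
    {
        "server_id": "local_demo",
        "adapter_type": "local",
        "capabilities": {
            "tools": [
                {"tool_id": "blastn", "features": {"databases": ["ecoli_demo"]}}
            ]
        },
    }
]

def find_server(tool_id: str, database: str):
    """Return the first server advertising both the tool and the database."""
    for server in SERVERS:
        for tool in server["capabilities"]["tools"]:
            if tool["tool_id"] == tool_id and database in tool["features"].get("databases", []):
                return server["server_id"]
    return None  # no compatible server: the run is rejected at validation
```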
Integration with OmniBioAI
TES is designed to be used server-to-server.
Typical flow:
- User chats with OmniBioAI
- LLM identifies a tool intent (e.g. `blast:`)
- Django backend calls TES
- TES executes and returns results
- Results are rendered in chat UI
No browser-to-TES calls are required.
What TES is not
- ❌ A workflow engine (Nextflow/WDL belong elsewhere)
- ❌ A UI
- ❌ An LLM executor
- ❌ A database
TES is a secure execution boundary.
Roadmap
- Real BLAST+ adapters (local / container / HPC)
- OMIM / annotation tools
- Async callbacks & long-running job tracking
- Auth / multi-tenant policies
- Artifact storage backends (S3, GCS, POSIX)
License
Apache 2.0 (or your chosen license)
File details
Details for the file omnibioai_tool_exec-0.1.1.tar.gz.
File metadata
- Download URL: omnibioai_tool_exec-0.1.1.tar.gz
- Upload date:
- Size: 22.0 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.13.9
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `6c0f921b3eaa23a6f9a797e216835a80ea40c18f047fdfefce370dd41591a320` |
| MD5 | `e8d29492ea1aded5be0a373096a98022` |
| BLAKE2b-256 | `cf1ebfb7d36cdb9d59e11ce5d7e92ffb60370408de32e0663c90bf367fcb47d2` |
File details
Details for the file omnibioai_tool_exec-0.1.1-py3-none-any.whl.
File metadata
- Download URL: omnibioai_tool_exec-0.1.1-py3-none-any.whl
- Upload date:
- Size: 32.3 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.13.9
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `510d7817c11a7fe61942c3d9bae63bb599d9e5a4ac9526dd1202e94248d5c16c` |
| MD5 | `3f9ae32b4f957e15c8eb82a8df646d6d` |
| BLAKE2b-256 | `36dcb9ac2ce37461685caf803ca9845ac0d0b1b2225136dc372c25ffae72a8dd` |