OAT — On-demand Agent Tooling. LLMs synthesize and execute one-shot tools with mandatory human approval.
LLMs don't pick from a menu. They cook what they need.
Website • Docs • AgenticWork Platform • Discussions
What is OAT?
OAT (On-demand Agent Tooling) lets LLMs synthesize tools on the fly instead of relying on pre-built tool libraries. You describe what you need in plain English. The LLM writes an async Python function, self-assesses its risk, and presents it for your review. You inspect the code and approve or deny. Approved tools execute in a sandbox and are discarded after use.
No MCP server to install. No schema to maintain. No tool registry to manage.
See it in action
Cloud LLM — AgenticWork API
OAT's default provider hits the AgenticWork platform's model router. No Anthropic key needed.
Self-hosted LLM — Ollama (air-gapped)
Point OAT at your own Ollama instance. Works behind firewalls, over VPNs, and on fully disconnected networks.
Cloud infrastructure — GCP, AWS, Azure
Synthesize cloud tools using ambient credentials. No SDK install, no config files.
How it works
Intent → Capabilities → LLM Synthesis → Human Approval → Sandbox Execution → Discard
- You describe what you want in natural language
- OAT resolves capabilities — which APIs, services, and credentials are available
- The LLM writes an async Python function tailored to your request, using only the capabilities you've enabled
- You review everything — the code, risk level, explanation, requested scopes — then approve or deny
- Approved tools execute in a sandbox with scoped credentials and a timeout
- Tools are discarded after use — no schema debt, no zombie tools, no tool registry bloat
The human-in-the-loop gate is mandatory and cannot be bypassed. Every synthesized tool is reviewed before execution.
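The timeout behavior described in the execution step can be illustrated with plain asyncio (a conceptual sketch, not OAT's actual Executor):

```python
import asyncio

# Conceptual sketch only, not OAT's real Executor: an approved tool's
# coroutine runs under a hard timeout and is cancelled if it runs over.
async def run_with_timeout(tool_coro, timeout_s: float):
    """Await the tool, returning None if it exceeds its time budget."""
    try:
        return await asyncio.wait_for(tool_coro, timeout=timeout_s)
    except asyncio.TimeoutError:
        return None  # cancelled: result discarded along with the tool
```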
Quick start
pip install oat-ai
CLI
# Default provider (AgenticWork API)
export AGENTICWORK_API_KEY=your-key
oat synth "list all S3 buckets in my AWS account"
# Dry run — see the synthesized code without executing
oat synth "get my AWS bill for this month" --dry-run
# Limit capabilities
oat synth "fetch the weather for NYC" -c http --dry-run
# Use Anthropic directly
oat synth "find open GitHub issues labeled bug" --provider anthropic
# Use Ollama on a remote server
oat synth "check disk usage" --provider ollama --base-url http://hal:11434 --model qwen2.5:32b
# Use AWS Bedrock
oat synth "list my EC2 instances" --provider bedrock
MCP Server (Claude Code integration)
Add to .mcp.json in your project root:
{
  "mcpServers": {
    "oat": {
      "command": "oat",
      "args": ["mcp", "serve"],
      "env": {
        "SYNTH_PROVIDER": "agenticwork",
        "AGENTICWORK_API_KEY": "your-key"
      }
    }
  }
}
Claude Code can then synthesize and execute tools on demand through the MCP protocol.
Python library
import asyncio
from oats import CapabilityRegistry, Synthesizer, Executor, HITLGate
from oats.core.llm import create_llm_client
from oats.hitl.gate import CLIApprovalHandler
async def main():
    registry = CapabilityRegistry()
    registry.register_builtin("http", "github", "aws")

    client = create_llm_client("agenticwork", api_key="your-key")
    synthesizer = Synthesizer(llm_client=client, capability_registry=registry)
    tool = await synthesizer.synthesize("get my AWS costs for the last 7 days by service")

    gate = HITLGate(handler=CLIApprovalHandler())
    decision = await gate.submit_for_approval(tool)

    if decision.approved:
        output = await Executor().execute(tool)
        print(output.result)

asyncio.run(main())
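The approval gate can only become stricter, never looser: a deny-only pre-screen could reject tools before they ever reach human review, while everything that passes still requires explicit approval. A purely illustrative sketch (these names are not part of the OAT API):

```python
# Illustrative only: not OAT's API. A deny-only pre-screen rejects tools
# whose self-assessed risk is too high; tools that pass still need a human.
RISK_ORDER = {"low": 0, "medium": 1, "high": 2}

def prescreen(risk_level: str, max_risk: str = "medium") -> bool:
    """Return True if the tool may proceed to human review."""
    return RISK_ORDER[risk_level] <= RISK_ORDER[max_risk]
```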
Supported LLM providers
| Provider | Config | Notes |
|---|---|---|
| AgenticWork (default) | --provider agenticwork | Platform model router, AGENTICWORK_API_KEY |
| Anthropic | --provider anthropic | Claude models, ANTHROPIC_API_KEY |
| AWS Bedrock | --provider bedrock | Claude on AWS, uses IAM credentials |
| Ollama | --provider ollama --base-url http://host:11434 | Local/self-hosted, any GGUF model |
| OpenAI-compatible | --provider openai --base-url https://your-api.com | vLLM, LocalAI, Azure OpenAI, etc. |
Built-in capabilities
| Capability | What it provides |
|---|---|
| http | HTTP requests to any API (httpx) |
| github | GitHub REST API — repos, issues, PRs, notifications |
| slack | Slack Web API — messages, channels, users |
| aws | AWS via boto3 — S3, EC2, Lambda, Cost Explorer, CloudWatch |
| gcp | Google Cloud — Storage, BigQuery, Compute, Billing |
| azure | Azure — Blob Storage, Cosmos DB, Key Vault, Functions |
| filesystem | Read/write local files (pathlib) |
| shell | Run shell commands (async subprocess) |
| json | Parse/transform JSON |
| datetime | Date/time with timezone support |
| data | Sort, filter, group, aggregate |
Capabilities are defined in YAML. Add your own for internal APIs, databases, or any service:
capabilities:
  - name: myapi
    description: Access the internal Acme API for order management
    auth:
      type: bearer
      token_env_var: ACME_API_TOKEN
    allowed_domains:
      - api.acme.internal
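An allowed_domains list like the one above can be enforced with a simple hostname check at execution time. A minimal sketch of the idea (illustrative, not OAT's actual enforcement code):

```python
from urllib.parse import urlparse

# Illustrative sketch: reject any request whose hostname is not on the
# capability's allowlist. Not OAT's actual implementation.
def url_allowed(url: str, allowed_domains: list[str]) -> bool:
    """True if the URL's hostname appears in the capability allowlist."""
    host = urlparse(url).hostname or ""
    return host in allowed_domains
```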
What can OAT replace?
| Instead of installing... | Just say... |
|---|---|
| A GitHub MCP server | oat synth "show my open PRs with failing CI" |
| An AWS cost tool | oat synth "get AWS spending for the last 30 days by service" |
| A GCP storage client | oat synth "list all GCS buckets and their sizes" |
| A Jira integration | oat synth "find overdue tickets assigned to me" |
| A Slack bot | oat synth "post deploy summary to #engineering" |
| A custom API wrapper | oat synth "call the Acme API and list active orders" |
AgenticWork Platform
OAT is the open-source engine behind the AgenticWork Platform. The platform adds:
- One-click OAuth — connect GitHub, AWS, GCP, Azure, Slack, Jira through your browser
- Credential vault — encrypted, scoped, auto-rotated tokens
- Web approval UI — review and approve tools with one click
- Server-side sandbox — isolated container execution on managed infra
- Team access controls — role-based permissions across your org
- Audit log — every synthesis, approval, and execution is recorded
Try the AgenticWork Platform →
Contributing
git clone https://github.com/agentic-work/oats.git
cd oats
pip install -e ".[dev]"
pytest # Run tests
mypy oats/ --ignore-missing-imports # Type check
ruff check oats/ # Lint
All three must pass. See CONTRIBUTING.md for guidelines.
License
MIT — see LICENSE