# ao-kernel

Governed AI orchestration runtime — policy-driven, fail-closed, evidence trail.

ao-kernel is not a general-purpose agent framework. It is a governed runtime that enforces policies, records evidence, and provides deterministic LLM routing for production Python teams.
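The fail-closed stance described above can be illustrated with a minimal sketch. All names here are hypothetical, not the ao-kernel API: a gate denies any action the policy does not explicitly allow, and appends every decision to an append-only JSONL evidence trail.

```python
import json
from datetime import datetime, timezone

# Hypothetical fail-closed gate: anything not explicitly allowed by the
# policy is denied, and every decision is recorded as evidence.
POLICY = {"allowed_actions": {"read_file", "llm_call"}}

def check_action(action: str, evidence_path: str = "evidence.jsonl") -> bool:
    allowed = action in POLICY["allowed_actions"]  # default-deny
    record = {
        "ts": datetime.now(timezone.utc).isoformat(),
        "action": action,
        "decision": "allow" if allowed else "deny",
    }
    with open(evidence_path, "a") as f:
        f.write(json.dumps(record) + "\n")  # append-only JSONL trail
    return allowed
```

Because the gate defaults to deny, an unknown action fails closed rather than slipping through, and the JSONL file is a self-hosted audit record of every decision.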
## Installation

```bash
pip install ao-kernel               # Core (only jsonschema dependency)
pip install ao-kernel[llm]          # LLM modules (tenacity + tiktoken)
pip install ao-kernel[mcp]          # MCP server support
pip install ao-kernel[otel]         # OpenTelemetry instrumentation
pip install ao-kernel[llm,mcp,otel] # Everything
```

Requires Python 3.11+.
## Quick Start

```bash
# Create workspace
ao-kernel init

# Check health
ao-kernel doctor
```

Library mode (no workspace required):

```python
from ao_kernel.config import load_default

policy = load_default("policies", "policy_autonomy.v1.json")
```

LLM routing:

```python
from ao_kernel.llm import build_request, normalize_response

request = build_request(
    provider_id="openai",
    model="gpt-4",
    messages=[{"role": "user", "content": "Hello"}],
    base_url="https://api.openai.com/v1/chat/completions",
    api_key="sk-...",
)
```

Streaming:

```python
from ao_kernel.llm import build_request

stream_request = build_request(
    provider_id="claude",
    model="claude-sonnet-4-20250514",
    messages=[{"role": "user", "content": "Hello"}],
    base_url="https://api.anthropic.com/v1/messages",
    api_key="sk-ant-...",
    stream=True,
)
```
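With `stream=True`, providers respond with server-sent events. As a minimal sketch of the generic SSE wire format (not ao-kernel's parser): each event is a block of `field: value` lines separated by a blank line, and `data:` lines carry the payload.

```python
# Generic SSE parsing sketch (illustrative, not ao-kernel's parser).
# Events are blank-line-separated blocks; `data:` lines hold payloads.
def parse_sse(raw: str) -> list[str]:
    events = []
    for block in raw.split("\n\n"):
        data_lines = [
            line[5:].lstrip()           # strip the "data:" prefix
            for line in block.split("\n")
            if line.startswith("data:")
        ]
        if data_lines:
            events.append("\n".join(data_lines))
    return events
```

A real client reads these blocks incrementally off the HTTP response, which is what allows a stream to end in an OK, PARTIAL, or FAIL state depending on how far it got.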
## CLI Reference

| Command | Description |
|---|---|
| `ao-kernel init` | Create `.ao/` workspace |
| `ao-kernel doctor` | Workspace health check (8 checks) |
| `ao-kernel migrate [--dry-run] [--backup]` | Version migration |
| `ao-kernel version` | Print version |
| `ao-kernel mcp serve` | Start MCP server (stdio) |
## Python API

### ao_kernel.config

| Function | Description |
|---|---|
| `workspace_root(override=None)` | Resolve workspace (returns `None` in library mode) |
| `load_default(resource_type, filename)` | Load bundled JSON default |
| `load_with_override(resource_type, filename, workspace)` | Workspace override > bundled default |
### ao_kernel.llm

| Function | Description |
|---|---|
| `resolve_route(intent, ...)` | Deterministic LLM routing |
| `build_request(provider_id, model, messages, ...)` | Provider-native HTTP request |
| `normalize_response(resp_bytes, provider_id)` | Extract text + usage + tool_calls |
| `extract_text(resp_bytes)` | Extract text from response |
| `execute_request(url, headers, body_bytes, ...)` | HTTP with retry + circuit breaker |
| `stream_request(url, headers, ...)` | SSE streaming with OK/PARTIAL/FAIL |
| `get_circuit_breaker(provider_id)` | Per-provider circuit breaker |
| `count_tokens(messages, provider_id, model)` | Token counting |

Supported providers: Claude, OpenAI, Google Gemini, DeepSeek, Qwen, xAI.
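The circuit breaker paired with `execute_request` follows the standard pattern; here is a minimal sketch of it with invented thresholds, not ao-kernel's implementation:

```python
import time

# Minimal circuit-breaker sketch (invented thresholds; not the ao-kernel
# implementation). After `max_failures` consecutive errors the breaker
# opens and calls fail fast until `reset_after` seconds have elapsed.
class CircuitBreaker:
    def __init__(self, max_failures: int = 3, reset_after: float = 30.0):
        self.max_failures = max_failures
        self.reset_after = reset_after
        self.failures = 0
        self.opened_at: float | None = None

    def allow(self) -> bool:
        if self.opened_at is None:
            return True
        if time.monotonic() - self.opened_at >= self.reset_after:
            self.opened_at = None      # half-open: let one attempt through
            self.failures = 0
            return True
        return False                    # open: fail fast, skip the call

    def record(self, ok: bool) -> None:
        if ok:
            self.failures = 0
        else:
            self.failures += 1
            if self.failures >= self.max_failures:
                self.opened_at = time.monotonic()
```

Keeping one breaker per `provider_id` (as `get_circuit_breaker` suggests) means an outage at one provider trips fast failures for that provider only, without blocking routes to the others.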
## MCP Server

ao-kernel runs as an MCP (Model Context Protocol) server, exposing governance tools:

```bash
ao-kernel mcp serve   # stdio transport
```

Tools:

- `ao_policy_check` — Validate action against policy (allow/deny)
- `ao_llm_route` — Resolve provider/model for intent
- `ao_quality_gate` — Check output quality
- `ao_workspace_status` — Workspace health

Resources:

- `ao://policies/{name}` — Policy JSON
- `ao://schemas/{name}` — Schema JSON
- `ao://registry/{name}` — Registry JSON
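Over the stdio transport, an MCP client invokes these tools with JSON-RPC 2.0 `tools/call` messages. Here is a sketch of what a call to `ao_policy_check` might look like; the `arguments` keys are assumptions, not ao-kernel's actual tool schema:

```python
import json

# Sketch of a JSON-RPC 2.0 `tools/call` request as an MCP client would
# send it over stdio. The `arguments` keys are hypothetical; consult the
# server's tool schema for the real parameter names.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "ao_policy_check",
        "arguments": {"action": "fs.write", "target": "/tmp/out.txt"},
    },
}
line = json.dumps(request)  # one JSON message per line on stdin
```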
## What Makes ao-kernel Different

| Feature | ao-kernel | LangGraph | CrewAI | Pydantic AI |
|---|---|---|---|---|
| Policy engine | 90+ policies | No | No | No |
| Fail-closed | Yes | No | No | No |
| Evidence trail | Self-hosted JSONL | LangSmith SaaS | No | No |
| Migration CLI | Yes | No | No | No |
| Doctor | Yes | No | No | No |
| MCP server | Yes | No | No | No |
| Streaming | SSE (6 providers) | Yes | Yes | Yes |
## Architecture

```
ao_kernel/          <- Public facade (clean API)
  cli.py            <- CLI commands
  config.py         <- Workspace + defaults resolver
  llm.py            <- LLM routing, building, normalization
  mcp_server.py     <- MCP server (4 tools, 3 resources)
  telemetry.py      <- OpenTelemetry (lazy no-op fallback)
  defaults/         <- 324 bundled JSON files (policies, schemas, registry)
src/                <- Compat shim (deprecated; use ao_kernel.*)
```
## License

MIT