
ao-kernel

Governed AI orchestration runtime — policy-driven, fail-closed, evidence-trail.

ao-kernel is not a general-purpose agent framework. It is a governed runtime that enforces policies, records evidence, and provides deterministic LLM routing for production Python teams.
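
"Fail-closed" means anything a policy does not explicitly allow is denied, including actions the policy has never seen. The snippet below is a minimal sketch of that principle in plain Python; the allow-list and function name are illustrative, not ao-kernel's actual policy API.

```python
# Illustrative fail-closed check: only explicitly allowed actions pass;
# anything unknown is denied by default.
ALLOWED_ACTIONS = {"read_file", "llm_call"}  # hypothetical allow-list

def policy_check(action: str) -> bool:
    """Return True only when the action is explicitly allowed."""
    return action in ALLOWED_ACTIONS  # unknown -> False (fail closed)

assert policy_check("read_file") is True
assert policy_check("delete_repo") is False  # never allowed -> denied
```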

Installation

pip install ao-kernel                # Core (only jsonschema dependency)
pip install ao-kernel[llm]           # LLM modules (tenacity + tiktoken)
pip install ao-kernel[mcp]           # MCP server support
pip install ao-kernel[otel]          # OpenTelemetry instrumentation
pip install ao-kernel[llm,mcp,otel]  # Everything

Requires Python 3.11+.

Quick Start

# Create workspace
ao-kernel init

# Check health
ao-kernel doctor

# Library mode (no workspace required)
from ao_kernel.config import load_default
policy = load_default("policies", "policy_autonomy.v1.json")

# LLM routing
from ao_kernel.llm import build_request, normalize_response

request = build_request(
    provider_id="openai",
    model="gpt-4",
    messages=[{"role": "user", "content": "Hello"}],
    base_url="https://api.openai.com/v1/chat/completions",
    api_key="sk-...",
)

# Streaming (reuses build_request imported above)
streaming_request = build_request(
    provider_id="claude",
    model="claude-sonnet-4-20250514",
    messages=[{"role": "user", "content": "Hello"}],
    base_url="https://api.anthropic.com/v1/messages",
    api_key="sk-ant-...",
    stream=True,
)
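
With stream=True the provider replies as server-sent events. The sketch below shows the SSE wire format such a client consumes; it is a generic illustration of SSE parsing, not ao-kernel's stream_request internals, and the sample payloads are made up.

```python
# Minimal SSE parser: events arrive as "data: {...}" lines; an
# OpenAI-style "[DONE]" sentinel ends the stream.
import json

def iter_sse_events(lines):
    """Yield decoded JSON payloads from an iterable of SSE lines."""
    for line in lines:
        line = line.strip()
        if not line.startswith("data:"):
            continue  # skip comments, "event:" lines, keep-alives
        payload = line[len("data:"):].strip()
        if payload == "[DONE]":  # end-of-stream sentinel
            return
        yield json.loads(payload)

raw = [
    'data: {"delta": "Hel"}',
    'data: {"delta": "lo"}',
    "data: [DONE]",
]
chunks = [event["delta"] for event in iter_sse_events(raw)]
assert "".join(chunks) == "Hello"
```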

CLI Reference

Command                                   Description
ao-kernel init                            Create .ao/ workspace
ao-kernel doctor                          Workspace health check (8 checks)
ao-kernel migrate [--dry-run] [--backup]  Version migration
ao-kernel version                         Print version
ao-kernel mcp serve                       Start MCP server (stdio)

Python API

ao_kernel.config

Function                                                Description
workspace_root(override=None)                           Resolve workspace (returns None in library mode)
load_default(resource_type, filename)                   Load bundled JSON default
load_with_override(resource_type, filename, workspace)  Workspace override > bundled default

ao_kernel.llm

Function                                          Description
resolve_route(intent, ...)                        Deterministic LLM routing
build_request(provider_id, model, messages, ...)  Provider-native HTTP request
normalize_response(resp_bytes, provider_id)       Extract text + usage + tool_calls
extract_text(resp_bytes)                          Extract text from response
execute_request(url, headers, body_bytes, ...)    HTTP with retry + circuit breaker
stream_request(url, headers, ...)                 SSE streaming with OK/PARTIAL/FAIL
get_circuit_breaker(provider_id)                  Per-provider circuit breaker
count_tokens(messages, provider_id, model)        Token counting
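
A per-provider circuit breaker stops sending traffic to a provider after repeated failures. The toy class below illustrates the pattern behind execute_request/get_circuit_breaker; the threshold, state model, and names are assumptions, not ao-kernel's configuration.

```python
# Toy circuit breaker: consecutive failures trip it open; any success
# closes it again. Real breakers usually add a timed half-open state.
class CircuitBreaker:
    def __init__(self, failure_threshold: int = 3):
        self.failure_threshold = failure_threshold
        self.failures = 0

    @property
    def open(self) -> bool:
        """An open circuit short-circuits further calls to the provider."""
        return self.failures >= self.failure_threshold

    def record_success(self) -> None:
        self.failures = 0  # any success closes the circuit

    def record_failure(self) -> None:
        self.failures += 1

breaker = CircuitBreaker()
for _ in range(3):
    breaker.record_failure()
assert breaker.open          # three failures trip the breaker
breaker.record_success()
assert not breaker.open      # a success resets it
```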

Supported providers: Claude, OpenAI, Google Gemini, DeepSeek, Qwen, xAI.
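
Deterministic routing means the same intent always resolves to the same provider/model pair, with no route falling through silently. The sketch below illustrates that contract with a hypothetical routing table; ao-kernel's actual resolve_route reads its routes from bundled JSON defaults.

```python
# Illustrative intent -> (provider, model) routing. The table contents
# are made up; only the determinism and fail-closed miss are the point.
ROUTES = {
    "chat": ("openai", "gpt-4"),
    "long_context": ("claude", "claude-sonnet-4-20250514"),
}

def resolve_route(intent: str) -> tuple[str, str]:
    """Same intent, same route, every time; unknown intents raise."""
    try:
        return ROUTES[intent]
    except KeyError:
        raise ValueError(f"no route for intent {intent!r}")  # fail closed

assert resolve_route("chat") == ("openai", "gpt-4")
```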

MCP Server

ao-kernel runs as an MCP (Model Context Protocol) server, exposing governance tools:

ao-kernel mcp serve  # stdio transport

Tools:

  • ao_policy_check — Validate action against policy (allow/deny)
  • ao_llm_route — Resolve provider/model for intent
  • ao_quality_gate — Check output quality
  • ao_workspace_status — Workspace health

Resources:

  • ao://policies/{name} — Policy JSON
  • ao://schemas/{name} — Schema JSON
  • ao://registry/{name} — Registry JSON
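
Over the stdio transport, an MCP client invokes these tools with JSON-RPC 2.0 "tools/call" requests. The payload below follows the MCP specification's request shape; the arguments passed to ao_policy_check are illustrative, not ao-kernel's documented tool schema.

```python
# Shape of an MCP tools/call request a client would send over stdio
# to invoke ao_policy_check. Argument names are assumptions.
import json

request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "ao_policy_check",
        "arguments": {"action": "llm_call"},  # hypothetical argument
    },
}
line = json.dumps(request)  # stdio transport sends one JSON message per line
assert json.loads(line)["params"]["name"] == "ao_policy_check"
```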

What Makes ao-kernel Different

Feature         ao-kernel           LangGraph       CrewAI  Pydantic AI
Policy engine   90+ policies        No              No      No
Fail-closed     Yes                 No              No      No
Evidence trail  Self-hosted JSONL   LangSmith SaaS  No      No
Migration CLI   Yes                 No              No      No
Doctor          Yes                 No              No      No
MCP server      Yes                 No              No      No
Streaming       SSE (6 providers)   Yes             Yes     Yes

Architecture

ao_kernel/          <- Public facade (clean API)
  cli.py            <- CLI commands
  config.py         <- Workspace + defaults resolver
  llm.py            <- LLM routing, building, normalization
  mcp_server.py     <- MCP server (4 tools, 3 resources)
  telemetry.py      <- OpenTelemetry (lazy no-op fallback)
  defaults/         <- 324 bundled JSON files (policies, schemas, registry)

src/                <- Compat shim (deprecated, use ao_kernel.*)
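
The "lazy no-op fallback" noted for telemetry.py is a common pattern: if OpenTelemetry is absent, hand out an inert tracer so instrumented code runs unchanged. This is a generic sketch of the pattern, not ao_kernel.telemetry's source.

```python
# Lazy no-op telemetry fallback: import OpenTelemetry only on demand,
# and substitute an inert tracer when it is not installed.
from contextlib import contextmanager

class _NoopTracer:
    @contextmanager
    def start_as_current_span(self, name):
        yield None  # record nothing

def get_tracer(name: str):
    try:
        from opentelemetry import trace  # optional dependency
        return trace.get_tracer(name)
    except ImportError:
        return _NoopTracer()

tracer = get_tracer("ao_kernel")
with tracer.start_as_current_span("demo"):
    pass  # instrumented code runs whether or not otel is installed
```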

License

MIT

Project details


Download files

Download the file for your platform.

Source Distribution

ao_kernel-0.1.0.tar.gz (270.5 kB)


Built Distribution


ao_kernel-0.1.0-py3-none-any.whl (444.9 kB)


File details

Details for the file ao_kernel-0.1.0.tar.gz.

File metadata

  • Download URL: ao_kernel-0.1.0.tar.gz
  • Upload date:
  • Size: 270.5 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for ao_kernel-0.1.0.tar.gz
Algorithm    Hash digest
SHA256       1fa043daa3bce5ec20ccc603b736291d7802fe9cef71ee2c0d9d6679dcfb2f2c
MD5          67421a81b84215a3e01aad0e91965b7f
BLAKE2b-256  040b3ba9cc0d57b9a2d2fe4e3d8a6f8631200e81461e5773d034c551785d1696


Provenance

The following attestation bundles were made for ao_kernel-0.1.0.tar.gz:

Publisher: publish.yml on Halildeu/ao-kernel

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file ao_kernel-0.1.0-py3-none-any.whl.

File metadata

  • Download URL: ao_kernel-0.1.0-py3-none-any.whl
  • Upload date:
  • Size: 444.9 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for ao_kernel-0.1.0-py3-none-any.whl
Algorithm    Hash digest
SHA256       04bf00cee02a1f1d8c73dec231c93e91f1b63c447da7db95ef80a1e6885213a4
MD5          de3427b0ddbadfdfa2d14b84fc15eb56
BLAKE2b-256  93dededcfee40e4d371ba9312267886f6e9253b800359ec08f6e16e8aa8b574e


Provenance

The following attestation bundles were made for ao_kernel-0.1.0-py3-none-any.whl:

Publisher: publish.yml on Halildeu/ao-kernel

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
