
Project description

🧠 coreason-runtime


The official zero-trust, high-throughput kinetic execution engine for the coreason-manifest ontology.

coreason-runtime is a State-of-the-Art (SOTA) 2026 cybernetic execution engine. It abandons legacy, fragile "chain-of-thought" LLM scripting in favor of deterministic Active Inference, Topological Data Analysis (TDA), and strictly bounded Markov Decision Processes.

If coreason-manifest is the DNA of your multi-agent topologies, coreason-runtime is the biological cell that safely executes them.


🚀 The Paradigm Shift

Modern enterprise AI cannot rely on unbounded while True loops and raw Python exec(). The coreason-runtime enforces mathematical rigor at every boundary:

  • Deterministic Orchestration: Built on Temporal, the runtime durably serializes Swarm executions. If a GPU dies or a network request fails, the Swarm pauses, rehydrates, and resumes exactly where it left off. No amnesia. No ghost processes.
  • Zero-Trust WASM Sandboxing: Kinetic actions (Tools) run inside isolated WebAssembly environments via Extism. Agents can perform complex IO without ever touching the host kernel or filesystem.
  • Epistemic Vector Ledger: Native, zero-copy integration with LanceDB. The runtime automatically projects the agent's latent state into an embedded vector memory layer.
  • Embedded Medallion Analytics: No need for heavy Spark clusters. Raw telemetry (SSE) is ingested via dlt and transformed into Silver/Gold analytical intelligence matrices using Rust-backed Polars directly inside the daemon.
  • Human-in-the-Loop (HITL) Webhooks: When an agent calculates high Variational Free Energy (epistemic uncertainty), it durably suspends its thread and emits an Oracle Request, waiting safely for a human expert to inject resolving priors via API.
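The HITL gate described above can be sketched as a toy surprise threshold. This is purely illustrative: the threshold value, the `oracle_request` payload shape, and the use of Shannon surprise as a stand-in for variational free energy are all assumptions, not the runtime's actual API.

```python
import math

SURPRISE_THRESHOLD = 3.0  # illustrative free-energy bound, in nats

def surprise(p_observation: float) -> float:
    """Shannon surprise -log p: a crude stand-in for variational free energy."""
    return -math.log(p_observation)

def step(p_observation: float) -> dict:
    """Either continue the inference loop or emit a hypothetical Oracle Request."""
    s = surprise(p_observation)
    if s > SURPRISE_THRESHOLD:
        # High epistemic uncertainty: suspend and wait for a human prior.
        return {"action": "suspend", "oracle_request": {"surprise": s}}
    return {"action": "continue", "surprise": s}
```

A likely observation (p = 0.9) continues the loop; a very unlikely one (p = 0.01) trips the gate and suspends.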

⚡ Installation

We use uv for ultra-fast, deterministic dependency resolution. Ensure you are running Python 3.14+.

uv add coreason-runtime

Note: For bare-metal enterprise deployment with SGLang GPU passthrough, refer to our Docker Deployment Guide.


🛠️ Quickstart

The runtime is designed to be operated via its CLI or exposed as an API edge.

1. Run a Local Swarm

To execute a mathematically verified agentic topology, simply pass the JSON/YAML manifest to the runtime:

coreason run ./my_swarm_manifest.json
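Before running, it can help to sanity-check a manifest. The sketch below assumes a hypothetical top-level schema (`name`, `agents`, `topology`); the real coreason-manifest fields may differ.

```python
import json

REQUIRED_KEYS = {"name", "agents", "topology"}  # hypothetical manifest fields

def validate_manifest(text: str) -> dict:
    """Parse a swarm manifest and check the keys this sketch assumes."""
    manifest = json.loads(text)
    missing = REQUIRED_KEYS - manifest.keys()
    if missing:
        raise ValueError(f"manifest missing keys: {sorted(missing)}")
    return manifest

example = '{"name": "demo", "agents": ["planner"], "topology": "star"}'
```

Feeding `example` through `validate_manifest` returns the parsed dict; a manifest missing `topology` raises a `ValueError` before anything is executed.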

2. Boot the API Edge & Telemetry Broker

To boot the runtime as a continuous daemon (exposing the CRDT State Sync, Schema Projection, and Server-Sent Events telemetry):

coreason serve --port 8000

Your frontend IDE can now connect to http://localhost:8000/api/v1/telemetry/stream to visualize the active inference loops in real-time.
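On the client side, the telemetry stream is standard Server-Sent Events, so a few lines of stdlib Python can split it into events. The `telemetry` event name and JSON payload below are illustrative, not the runtime's documented wire format.

```python
def parse_sse(stream: str) -> list:
    """Split a Server-Sent Events payload into (event, data) pairs."""
    events = []
    event_type, data_lines = "message", []
    for line in stream.splitlines():
        if line.startswith("event:"):
            event_type = line[len("event:"):].strip()
        elif line.startswith("data:"):
            data_lines.append(line[len("data:"):].strip())
        elif line == "" and data_lines:
            # A blank line terminates one event.
            events.append((event_type, "\n".join(data_lines)))
            event_type, data_lines = "message", []
    return events
```

In practice you would feed this from the HTTP response body of the /api/v1/telemetry/stream endpoint.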


🏗️ Architecture

The runtime operates across four isolated computational boundaries:

  1. The Orchestrator: Temporal Python SDK running deterministic AST-scanned workflows.
  2. The Cognitive Engine: SGLang routing for sub-millisecond constrained tensor inference.
  3. The Kinetic Sandbox: Extism executing .wasm MCP plugins.
  4. The Epistemic Store: LanceDB & Polars managing long-term vectors and ETL metrics.
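One way to picture the cybernetic loop is a task passing through four stage handlers, one per boundary. The code is purely illustrative: each function is a trivial stand-in for the named subsystem, and none of these names exist in the actual package.

```python
def orchestrate(task: str) -> dict:
    """Durable workflow step (stand-in for the Temporal orchestrator)."""
    return {"task": task, "trace": ["orchestrator"]}

def infer(state: dict) -> dict:
    """Constrained inference: choose an action (stand-in for SGLang)."""
    state["trace"].append("cognitive")
    state["action"] = f"run:{state['task']}"
    return state

def execute(state: dict) -> dict:
    """Sandboxed kinetic action (stand-in for an Extism .wasm tool)."""
    state["trace"].append("kinetic")
    state["result"] = state["action"].upper()
    return state

def persist(state: dict) -> dict:
    """Epistemic store: project the result into memory (stand-in for LanceDB)."""
    state["trace"].append("epistemic")
    return state

def loop(task: str) -> dict:
    """One pass of the cybernetic loop across all four boundaries."""
    return persist(execute(infer(orchestrate(task))))
```

The trace accumulated by `loop` mirrors the boundary ordering in the list above.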

For a deep dive into the cybernetic loop, read the Architecture Documentation.


📜 License

This software is source-available under the Prosperity Public License 3.0, which permits noncommercial use. Commercial use beyond a 30-day trial requires a separate commercial license. See the LICENSE file for details.

Copyright (c) 2026 CoReason, Inc.



Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

coreason_runtime-0.5.0.tar.gz (1.2 MB)

Uploaded Source

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

coreason_runtime-0.5.0-py3-none-any.whl (306.0 kB)

Uploaded Python 3

File details

Details for the file coreason_runtime-0.5.0.tar.gz.

File metadata

  • Download URL: coreason_runtime-0.5.0.tar.gz
  • Size: 1.2 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for coreason_runtime-0.5.0.tar.gz
Algorithm Hash digest
SHA256 3a6b0001ea0925a7be6572389864199ab21e520fa76f44a356592d4b834fdf07
MD5 64e82fefb380f0eaf31fe9ea69ecaa50
BLAKE2b-256 a42119a3b19f331fe6bc481095c21948a3c0446f797a8a2d09234c5e222ac881


Provenance

The following attestation bundles were made for coreason_runtime-0.5.0.tar.gz:

Publisher: publish.yml on CoReason-AI/coreason-runtime

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file coreason_runtime-0.5.0-py3-none-any.whl.


File hashes

Hashes for coreason_runtime-0.5.0-py3-none-any.whl
Algorithm Hash digest
SHA256 ef4997f8db32fbc857a628e315220330a83a9fbb12de4dc994ca499733004a4b
MD5 d7d0baf0fd269806e57f71f6233debbc
BLAKE2b-256 924a68369fd82b36fa2d2c8e269363a95ebbba9b4f6502aa041ca430fdd1b773


Provenance

The following attestation bundles were made for coreason_runtime-0.5.0-py3-none-any.whl:

Publisher: publish.yml on CoReason-AI/coreason-runtime

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
