
🧠 coreason-runtime


The official zero-trust, high-throughput kinetic execution engine for the coreason-manifest ontology.

coreason-runtime is a state-of-the-art (SOTA) cybernetic execution engine for 2026. It abandons fragile, legacy "chain-of-thought" LLM scripting in favor of deterministic Active Inference, Topological Data Analysis (TDA), and strictly bounded Markov Decision Processes.

If coreason-manifest is the DNA of your multi-agent topologies, coreason-runtime is the biological cell that safely executes them.


🚀 The Paradigm Shift

Modern enterprise AI cannot rely on unbounded while True loops and raw Python exec() calls. coreason-runtime enforces mathematical rigor at every boundary:

  • Deterministic Orchestration: Built on Temporal, the runtime durably serializes Swarm executions. If a GPU dies or a network request fails, the Swarm pauses, rehydrates, and resumes exactly where it left off. No amnesia. No ghost processes.
  • Zero-Trust WASM Sandboxing: Kinetic actions (Tools) are executed inside isolated WebAssembly environments via Extism. Agents can execute complex IO without ever touching your host's root kernel or filesystem.
  • Epistemic Vector Ledger: Native, zero-copy integration with LanceDB. The runtime automatically projects the agent's latent state into an embedded vector memory layer.
  • Embedded Medallion Analytics: No need for heavy Spark clusters. Raw telemetry (SSE) is ingested via dlt and transformed into Silver/Gold analytical intelligence matrices using Rust-backed Polars directly inside the daemon.
  • Human-in-the-Loop (HITL) Webhooks: When an agent calculates high Variational Free Energy (epistemic uncertainty), it durably suspends its thread and emits an Oracle Request, waiting safely for a human expert to inject resolving priors via API.

⚡ Installation

We use uv for ultra-fast, deterministic dependency resolution. Ensure you are running Python 3.14 or later.

uv add coreason-runtime

Note: For bare-metal enterprise deployment with SGLang GPU passthrough, refer to our Docker Deployment Guide.


🛠️ Quickstart

The runtime is designed to be operated via its CLI or mounted as an API edge.

1. Run a Local Swarm

To execute a mathematically verified agentic topology, simply pass the JSON/YAML manifest to the runtime:

coreason run ./my_swarm_manifest.json
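If you want to sanity-check a manifest before handing it to the runtime, a stdlib-only loader is enough. The field names below (version, agents, edges) are invented for illustration and are not the real coreason-manifest schema:

```python
import json

# Hypothetical minimal swarm manifest; the real coreason-manifest schema may differ.
MANIFEST = """
{
  "version": "1.0",
  "agents": [{"id": "planner"}, {"id": "executor"}],
  "edges": [["planner", "executor"]]
}
"""

def load_manifest(text: str) -> dict:
    """Parse a manifest and verify that every edge references a declared agent."""
    doc = json.loads(text)
    ids = {agent["id"] for agent in doc["agents"]}
    for src, dst in doc["edges"]:
        if src not in ids or dst not in ids:
            raise ValueError(f"edge references unknown agent: {src} -> {dst}")
    return doc
```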

2. Boot the API Edge & Telemetry Broker

To boot the runtime as a continuous daemon (exposing the CRDT State Sync, Schema Projection, and Server-Sent Events telemetry):

coreason serve --port 8000

Your frontend IDE can now connect to http://localhost:8000/api/v1/telemetry/stream to visualize the active inference loops in real time.
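On the client side, a few lines suffice to split that stream into events, assuming it follows standard Server-Sent Events framing (event:/data: fields, events separated by blank lines). The parser below is a sketch, not the runtime's official client:

```python
def parse_sse(raw: str) -> list[dict]:
    """Split a raw SSE stream into events: dicts with 'event' and 'data' keys."""
    events = []
    for block in raw.split("\n\n"):
        event, data = "message", []
        for line in block.splitlines():
            if line.startswith("event:"):
                event = line[len("event:"):].strip()
            elif line.startswith("data:"):
                data.append(line[len("data:"):].strip())
        if data:  # blocks without data (comments, keep-alives) carry no event
            events.append({"event": event, "data": "\n".join(data)})
    return events
```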


🏗️ Architecture

The runtime operates across four isolated computational boundaries:

  1. The Orchestrator: Temporal Python SDK running deterministic AST-scanned workflows.
  2. The Cognitive Engine: SGLang routing for sub-millisecond constrained tensor inference.
  3. The Kinetic Sandbox: Extism executing .wasm MCP plugins.
  4. The Epistemic Store: LanceDB & Polars managing long-term vectors and ETL metrics.
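Conceptually, a single cognitive tick flows through those four boundaries in order. The sketch below models only that composition; the stage names and string-typed state are illustrative, not the runtime's interfaces:

```python
from typing import Callable

Stage = Callable[[str], str]

def build_tick(orchestrate: Stage, infer: Stage, act: Stage, store: Stage) -> Stage:
    """Compose the four boundaries into one pass over an opaque state value."""
    def tick(state: str) -> str:
        for stage in (orchestrate, infer, act, store):
            state = stage(state)
        return state
    return tick
```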

For a deep dive into the cybernetic loop, read the Architecture Documentation.


📜 License

This software is dual-licensed. Noncommercial use is governed by the Prosperity Public License 3.0; commercial use beyond the 30-day trial requires a separate commercial license. See the LICENSE file for details.

Copyright (c) 2026 CoReason, Inc.



Download files


Source Distribution

coreason_runtime-0.1.1.tar.gz (155.2 kB)

Uploaded Source

Built Distribution


coreason_runtime-0.1.1-py3-none-any.whl (37.5 kB)

Uploaded Python 3

File details

Details for the file coreason_runtime-0.1.1.tar.gz.

File metadata

  • Download URL: coreason_runtime-0.1.1.tar.gz
  • Size: 155.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for coreason_runtime-0.1.1.tar.gz:

  • SHA256: 6982596e10fccfa224cb85e6d35224ce2251b4742579bccdcb32f6f118341484
  • MD5: f56a4f6a858d17becab2fc286a0233f3
  • BLAKE2b-256: 792e5b0377286b903c2b58980f35f80aedabea595e87cc6e56877ace6ed6bf92

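To check a downloaded archive against the SHA256 digest above, Python's standard library is sufficient:

```python
import hashlib

def sha256_of(path: str) -> str:
    """Stream a file through SHA-256 and return its hex digest."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()
```

Compare sha256_of("coreason_runtime-0.1.1.tar.gz") — using whatever path you saved the download to — against the digest listed above before installing from the sdist.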

Provenance

The following attestation bundles were made for coreason_runtime-0.1.1.tar.gz:

Publisher: publish.yml on CoReason-AI/coreason-runtime

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file coreason_runtime-0.1.1-py3-none-any.whl.


File hashes

Hashes for coreason_runtime-0.1.1-py3-none-any.whl:

  • SHA256: e73042cf50e7fe4aacd360cf366fabedcd33e9d74470c5efb7a6057900dd6417
  • MD5: b1107fcb43d55f82bc9a2c9f7dfba3c6
  • BLAKE2b-256: c5f1f80ee0b13d87e37713fe5bb6a378bb41c5906922bf0424fab4b4ab47a725


Provenance

The following attestation bundles were made for coreason_runtime-0.1.1-py3-none-any.whl:

Publisher: publish.yml on CoReason-AI/coreason-runtime

