
🧠 coreason-runtime


The official zero-trust, high-throughput kinetic execution engine for the coreason-manifest ontology.

coreason-runtime is a State-of-the-Art (SOTA) 2026 cybernetic execution engine. It abandons legacy, fragile "chain-of-thought" LLM scripting in favor of deterministic Active Inference, Topological Data Analysis (TDA), and strictly bounded Markov Decision Processes.

If coreason-manifest is the DNA of your multi-agent topologies, coreason-runtime is the biological cell that safely executes them.


🚀 The Paradigm Shift

Modern enterprise AI cannot rely on unbounded `while True` loops and raw Python `exec()`. The coreason-runtime enforces mathematical rigor at every boundary:

  • Deterministic Orchestration: Swarm executions are durably serialized on Temporal. If a GPU dies or a network request fails, the Swarm pauses, rehydrates, and resumes exactly where it left off. No amnesia. No ghost processes.
  • Zero-Trust WASM Sandboxing: Kinetic actions (Tools) are executed inside isolated WebAssembly environments via Extism. Agents can execute complex IO without ever touching your host's root kernel or filesystem.
  • Epistemic Vector Ledger: Native, zero-copy integration with LanceDB. The runtime automatically projects the agent's latent state into an embedded vector memory layer.
  • Embedded Medallion Analytics: No need for heavy Spark clusters. Raw telemetry (SSE) is ingested via dlt and transformed into Silver/Gold analytical intelligence matrices using Rust-backed Polars directly inside the daemon.
  • Human-in-the-Loop (HITL) Webhooks: When an agent calculates high Variational Free Energy (epistemic uncertainty), it durably suspends its thread and emits an Oracle Request, waiting safely for a human expert to inject resolving priors via API.

⚡ Installation

We utilize uv for ultra-fast, deterministic resolution. Ensure you are running Python 3.14+.

uv add coreason-runtime

Note: For bare-metal enterprise deployment with SGLang GPU passthrough, refer to our Docker Deployment Guide.


🛠️ Quickstart

The runtime is designed to be operated via its CLI or mounted as an API edge.

1. Run a Local Swarm

To execute a mathematically verified agentic topology, simply pass the JSON/YAML manifest to the runtime:

coreason run ./my_swarm_manifest.json
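The manifest schema itself is defined by coreason-manifest; the field names below (`name`, `agents`, `edges`) are hypothetical placeholders, shown only to illustrate the general shape of a JSON manifest you might generate and pass to `coreason run`:

```python
import json

# Hypothetical manifest fields -- consult the coreason-manifest schema for
# the real ontology; these names are illustrative only.
manifest = {
    "name": "research_swarm",
    "agents": [
        {"id": "planner", "role": "decompose goals into bounded MDP steps"},
        {"id": "executor", "role": "invoke WASM tools in the kinetic sandbox"},
    ],
    "edges": [{"from": "planner", "to": "executor"}],
}

# Write the manifest to disk, ready for `coreason run ./my_swarm_manifest.json`.
with open("my_swarm_manifest.json", "w") as fh:
    json.dump(manifest, fh, indent=2)
```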

2. Boot the API Edge & Telemetry Broker

To boot the runtime as a continuous daemon (exposing the CRDT State Sync, Schema Projection, and Server-Sent Events telemetry):

coreason serve --port 8000

Your frontend IDE can now connect to http://localhost:8000/api/v1/telemetry/stream to visualize the active inference loops in real time.
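The stream follows the standard Server-Sent Events wire format (blank-line-delimited frames of `field: value` lines). A minimal, dependency-free parser for such frames is sketched below; the `inference_step` event name and its payload are invented for illustration and are not guaranteed by the runtime:

```python
def parse_sse(raw: bytes) -> list[dict]:
    """Split a raw SSE byte stream into frames of field -> value dicts."""
    events: list[dict] = []
    current: dict = {}
    for line in raw.decode("utf-8").splitlines():
        if not line:                         # blank line terminates a frame
            if current:
                events.append(current)
                current = {}
        elif ":" in line:
            key, _, value = line.partition(":")
            current[key.strip()] = value.lstrip()
    if current:                              # flush a trailing unterminated frame
        events.append(current)
    return events


# Hypothetical telemetry frame, as it might arrive from the stream endpoint:
sample = b'event: inference_step\ndata: {"free_energy": 1.7}\n\n'
frames = parse_sse(sample)
```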


🏗️ Architecture

The runtime operates across four isolated computational boundaries:

  1. The Orchestrator: Temporal Python SDK running deterministic AST-scanned workflows.
  2. The Cognitive Engine: SGLang routing for sub-millisecond constrained tensor inference.
  3. The Kinetic Sandbox: Extism executing .wasm MCP plugins.
  4. The Epistemic Store: LanceDB & Polars managing long-term vectors and ETL metrics.
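To make the data flow across these boundaries concrete, here is an illustrative composition using plain stubs. The real runtime wires Temporal, SGLang, Extism, and LanceDB into these roles; the functions below only show the order in which state crosses each isolation boundary:

```python
def orchestrate(goal: str) -> list[str]:            # 1. The Orchestrator
    """Deterministically expand a goal into workflow steps (stub)."""
    return [f"plan step for {goal}"]


def infer(step: str) -> str:                        # 2. The Cognitive Engine
    """Run constrained inference on a step (stub)."""
    return f"constrained inference on '{step}'"


def act(inference: str) -> str:                     # 3. The Kinetic Sandbox
    """Execute the inferred action inside an isolated sandbox (stub)."""
    return f"wasm result of [{inference}]"


def store(result: str, ledger: list[str]) -> None:  # 4. The Epistemic Store
    """Project the outcome into long-term vector memory (stub)."""
    ledger.append(result)


ledger: list[str] = []
for step in orchestrate("summarize logs"):
    store(act(infer(step)), ledger)
```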

For a deep dive into the cybernetic loop, read the Architecture Documentation.


📜 License

This software is proprietary and dual-licensed under the Prosperity Public License 3.0. Commercial use beyond a 30-day trial requires a separate commercial license. See the LICENSE file for details.

Copyright (c) 2026 CoReason, Inc.
