
zymi-core

Event-sourced agent engine for auditable AI workflows in Rust, YAML, and Python.

zymi-core helps you build agent workflows you can inspect after the fact. Every run is recorded as an immutable event stream in SQLite, agent side effects are mediated through intentions and boundary contracts, and pipelines execute as DAGs with parallel steps when possible.

Highlights

  • Auditable by default: every state change is persisted as an event with hash-chain verification.
  • Safer side effects: agents emit intentions first; contracts and approvals decide what is allowed to execute.
  • Practical workflows: define agents and DAG pipelines in YAML, then run them from a small CLI.
  • Flexible integration points: use the Rust crate, Python bindings, or both — Python can drive pipelines directly via Runtime.for_project(...).run_pipeline(...), no subprocess.
  • LLM-provider ready: OpenAI-compatible providers, Anthropic support, Python tools, and LangFuse event services.

Installation

If you want to...          Install with...
CLI + Python bindings      pip install zymi-core
Rust crate only            zymi-core = "0.1"

pip install zymi-core gives you both the zymi CLI command and the zymi_core Python module.

Quick Start

# Install
pip install zymi-core

# Create a demo project
mkdir zymi-demo
cd zymi-demo
zymi init --example research

# Add your LLM provider config to project.yml, then run the pipeline
zymi run research -i topic="event sourcing"

# Inspect what happened
zymi events --limit 20
zymi verify

For example, this is enough to get started with OpenAI:

llm:
  provider: openai
  model: gpt-4o
  api_key: ${env.OPENAI_API_KEY}
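The `${env.OPENAI_API_KEY}` placeholder suggests the config loader substitutes environment variables at load time. As a rough illustration of how such a placeholder can be resolved (this is a sketch, not the actual loader, which lives in the Rust crate):

```python
import os
import re

# Matches ${env.SOME_VAR} placeholders in config values.
_ENV_PATTERN = re.compile(r"\$\{env\.([A-Za-z_][A-Za-z0-9_]*)\}")

def interpolate_env(value: str) -> str:
    """Replace each ${env.NAME} placeholder with os.environ[NAME]."""
    def repl(match: re.Match) -> str:
        name = match.group(1)
        if name not in os.environ:
            raise KeyError(f"missing environment variable: {name}")
        return os.environ[name]
    return _ENV_PATTERN.sub(repl, value)

os.environ["OPENAI_API_KEY"] = "sk-demo"
print(interpolate_env("api_key=${env.OPENAI_API_KEY}"))  # api_key=sk-demo
```

Keeping secrets in the environment rather than in `project.yml` means the YAML file can be committed without leaking keys.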

What this gives you:

  • project.yml for provider config, policies, contracts, and defaults
  • agents/ for agent definitions
  • pipelines/ for DAG workflows
  • .zymi/events.db for the append-only event log
  • output/ and memory/ directories in the research example

Common CLI commands

zymi init --name my-project
zymi init --example research

zymi run main -i task="Summarize the architecture"
zymi run research -i topic="Rust event sourcing"

# Long-running mode: react to PipelineRequested events from any process
zymi serve research

zymi events
zymi events --stream conversation-1
zymi events --kind LlmCallCompleted --json

zymi replay conversation-1 --from 1
zymi verify
zymi verify --stream conversation-1

Project Layout

A zymi project is just a directory with YAML files:

my-project/
  project.yml
  agents/
    default.yml
  pipelines/
    main.yml
  .zymi/
    events.db

The default scaffold created by zymi init is intentionally small:

# project.yml
name: my-project
version: "0.1"

defaults:
  timeout_secs: 30
  max_iterations: 10

policy:
  enabled: true
  allow: ["ls *", "cat *", "echo *"]
  deny: ["rm -rf *"]
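The `allow`/`deny` entries read like shell globs. Assuming glob semantics with deny taking precedence (an assumption; check the policy docs for the exact matching rules), the evaluation can be sketched as:

```python
from fnmatch import fnmatch

def evaluate_policy(command: str, allow: list[str], deny: list[str]) -> bool:
    """Return True if the command is permitted. Deny patterns win over allow."""
    if any(fnmatch(command, pattern) for pattern in deny):
        return False
    return any(fnmatch(command, pattern) for pattern in allow)

allow = ["ls *", "cat *", "echo *"]
deny = ["rm -rf *"]
print(evaluate_policy("ls -la", allow, deny))       # True
print(evaluate_policy("rm -rf /tmp", allow, deny))  # False
```

Note that under these semantics anything not explicitly allowed is rejected, which matches the fail-closed posture described elsewhere in this README.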

# agents/default.yml
name: default
description: "Default agent"
tools:
  - web_search
  - read_file
  - write_memory
max_iterations: 10

# pipelines/main.yml
name: main

steps:
  - id: process
    agent: default
    task: "${inputs.task}"

input:
  type: text

output:
  step: process

Python Bindings

The same pip install zymi-core that gives you the CLI also exposes a Runtime for running pipelines directly, plus the lower-level Event, EventBus, EventStore, Subscription, and ToolRegistry primitives for custom integrations.

Run a pipeline from Python

from zymi_core import Runtime

# Loads project.yml + agents/ + pipelines/ from the given directory and
# builds the same Runtime `zymi run` and `zymi serve` use. `approval` is
# either "terminal" (fail-closed prompt on stdin, matches `zymi run`) or
# "none" (intentions tagged RequiresHumanApproval resolve to a deny).
rt = Runtime.for_project(".", approval="terminal")

result = rt.run_pipeline("research", {"topic": "rust event sourcing"})
print(result.success, result.final_output)
for step in result.step_results:
    print(step.step_id, step.iterations, step.success)

rt.bus() and rt.store() hand out Python wrappers over the runtime's own Arcs, so any subscriber you attach there sees exactly the events the handler publishes — there is no second bus over the same SQLite file.

Tool registry and event primitives

from zymi_core import ToolRegistry

registry = ToolRegistry()

@registry.tool
def search(query: str) -> str:
    return f"Results for: {query}"

result = registry.call("search", '{"query":"rust async"}')
intention_json = registry.to_intention("search", '{"query":"rust async"}')
definitions = registry.definitions()
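A decorator-driven registry of this shape can be as little as a dict from function name to callable, with JSON arguments deserialized at call time. A toy re-implementation of the pattern (hypothetical, not the actual zymi_core internals):

```python
import inspect
import json
from typing import Callable

class MiniToolRegistry:
    """Toy sketch of a decorator-driven tool registry."""

    def __init__(self) -> None:
        self._tools: dict[str, Callable] = {}

    def tool(self, fn: Callable) -> Callable:
        # Register the function under its own name; return it unchanged
        # so the decorator is transparent to normal callers.
        self._tools[fn.__name__] = fn
        return fn

    def call(self, name: str, args_json: str):
        # Deserialize the JSON argument string into keyword arguments.
        kwargs = json.loads(args_json)
        return self._tools[name](**kwargs)

    def definitions(self) -> list[dict]:
        # Expose names and parameter names, e.g. to build an LLM tool schema.
        return [
            {"name": name, "parameters": list(inspect.signature(fn).parameters)}
            for name, fn in self._tools.items()
        ]

registry = MiniToolRegistry()

@registry.tool
def search(query: str) -> str:
    return f"Results for: {query}"

print(registry.call("search", '{"query": "rust async"}'))  # Results for: rust async
```

The real registry additionally produces intentions (`to_intention`), so a tool call can be checked against contracts before it runs rather than executing immediately.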

For lower-level integrations, the same package exposes the event store and bus directly:

from zymi_core import Event, EventBus, EventStore

store = EventStore("./events.db")
bus = EventBus(store)
subscription = bus.subscribe()

event = Event(
    stream_id="conversation-1",
    kind={"type": "UserMessageReceived", "data": {
        "content": {"User": "Hello"},
        "connector": "python",
    }},
    source="python",
)

bus.publish(event)
received = subscription.try_recv()

Multi-Process Integration (Django, Celery, scripts)

The Python wrapper for EventStore opens the same SQLite file the Rust side uses. There is no second IPC channel — events written from one process are visible to every other process that opens the same store, and a long-running zymi serve picks them up via a polling tail watcher (see ADR-0012).
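A polling tail watcher of this kind only needs to remember the highest sequence number it has seen and periodically select anything newer. A self-contained sketch against a hypothetical `events(seq, stream_id, payload)` table (the real schema will differ):

```python
import sqlite3

def tail_new_events(conn: sqlite3.Connection, last_seq: int):
    """Fetch events with seq > last_seq; return (rows, updated last_seq)."""
    rows = conn.execute(
        "SELECT seq, stream_id, payload FROM events WHERE seq > ? ORDER BY seq",
        (last_seq,),
    ).fetchall()
    if rows:
        last_seq = rows[-1][0]  # advance the cursor past what we consumed
    return rows, last_seq

# Demo with an in-memory database standing in for .zymi/events.db.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (seq INTEGER PRIMARY KEY, stream_id TEXT, payload TEXT)")
conn.execute("INSERT INTO events (stream_id, payload) VALUES ('s1', 'PipelineRequested')")

rows, cursor = tail_new_events(conn, 0)
print(len(rows), cursor)  # 1 1

# A second poll with the updated cursor sees nothing new.
rows, cursor = tail_new_events(conn, cursor)
print(len(rows))  # 0
```

In a long-running process you would call `tail_new_events` on a timer and fan the rows out to local subscribers, which is essentially what the Rust `StoreTailWatcher` described below does.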

The canonical pattern: a web app publishes a PipelineRequested event, zymi serve runs the pipeline, and the result comes back as a PipelineCompleted event with the same correlation_id.

Terminal A — long-running Rust service:

cd my-zymi-project
zymi serve research

Terminal B — any Python process (e.g. a Django view):

import uuid
from zymi_core import Event, EventBus, EventStore

store = EventStore(".zymi/events.db")
bus = EventBus(store)

correlation_id = str(uuid.uuid4())
sub = bus.subscribe_correlation(correlation_id)

event = Event(
    stream_id=f"web-req-{correlation_id}",
    kind={"type": "PipelineRequested", "data": {
        "pipeline": "research",
        "inputs": {"topic": "rust event sourcing"},
    }},
    source="django",
)
event.with_correlation(correlation_id)
bus.publish(event)

# Block until the serve process publishes PipelineCompleted with the
# same correlation_id (timeout in seconds).
result = sub.recv(timeout_secs=300)
print(result.kind)  # {"type": "PipelineCompleted", "data": {...}}

Because the SQLite store is the single source of truth, you also get free auditing: zymi events --stream web-req-... shows everything that happened during the run, and zymi verify checks the hash chain.
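Hash-chain verification of the kind `zymi verify` performs works by recomputing each event's hash from its payload plus the previous event's hash and comparing against the stored value. The sketch below is illustrative only; the field names and exact hash input are assumptions, not zymi-core's actual scheme:

```python
import hashlib
import json

def chain_hash(prev_hash: str, payload: dict) -> str:
    """Hash the previous link together with a canonical payload encoding."""
    body = prev_hash + json.dumps(payload, sort_keys=True)
    return hashlib.sha256(body.encode()).hexdigest()

def append_event(chain: list[dict], payload: dict) -> None:
    prev = chain[-1]["hash"] if chain else "genesis"
    chain.append({"payload": payload, "hash": chain_hash(prev, payload)})

def verify_chain(chain: list[dict]) -> bool:
    """Recompute every link; tampering with any payload breaks verification."""
    prev = "genesis"
    for event in chain:
        if event["hash"] != chain_hash(prev, event["payload"]):
            return False
        prev = event["hash"]
    return True

chain: list[dict] = []
append_event(chain, {"type": "PipelineRequested"})
append_event(chain, {"type": "PipelineCompleted"})
print(verify_chain(chain))  # True

chain[0]["payload"]["type"] = "Tampered"
print(verify_chain(chain))  # False
```

Because each hash folds in the previous one, editing any historical event invalidates every later link, which is what makes the log tamper-evident rather than merely append-only by convention.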

Inside zymi serve the PipelineRequested → RunPipeline translation is done by EventCommandRouter (see ADR-0013). It is re-exported from zymi_core::runtime, so if you are building your own scheduler or bot adapter you can wire the same router against your own Runtime without copy-pasting cli/serve.rs.

Rust Crate

Add the crate to your Cargo.toml:

[dependencies]
zymi-core = "0.1"

Example (the snippet assumes an async context where `?` can propagate errors):

use std::sync::Arc;
use zymi_core::{open_store, Event, EventBus, EventKind, Message, StoreBackend};

let store = open_store(StoreBackend::Sqlite { path: "events.db".into() })?;
let bus = EventBus::new(store.clone());

let mut rx = bus.subscribe().await;

let event = Event::new(
    "conversation-1".into(),
    EventKind::UserMessageReceived {
        content: Message::User("Hello".into()),
        connector: "cli".into(),
    },
    "cli".into(),
);

bus.publish(event).await?;
let received = rx.recv().await.unwrap();
assert_eq!(received.kind_tag(), "user_message_received");

let verified_count = store.verify_chain("conversation-1").await?;

For cross-process delivery in your own binary, spawn a StoreTailWatcher on the same store/bus — it polls for events written by other processes and fans them out into local subscribers without re-persisting them:

use std::time::Duration;
use zymi_core::StoreTailWatcher;

let watcher = StoreTailWatcher::new(store.clone(), bus.clone())
    .with_interval(Duration::from_millis(100))
    .spawn();

// ... later, on shutdown:
watcher.stop().await;

How It Works

zymi-core is built around a small set of ideas:

  1. Every meaningful state change becomes an event. The SQLite event store is the source of truth.
  2. Agents express intentions, not side effects. Intentions are evaluated against boundary contracts before execution.
  3. Pipelines are DAGs. Independent steps can run in parallel, while dependencies remain explicit.
  4. Runs stay replayable. You can inspect events, replay streams, and verify hash-chain integrity later.
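Idea 3 amounts to running every step whose dependencies have already completed, in waves. A small sketch of that scheduling using Kahn-style topological layering (the step format here is hypothetical, not the pipeline YAML schema):

```python
def execution_waves(deps: dict[str, set[str]]) -> list[list[str]]:
    """Group steps into waves; steps within one wave can run in parallel."""
    remaining = {step: set(d) for step, d in deps.items()}
    done: set[str] = set()
    waves: list[list[str]] = []
    while remaining:
        # A step is ready once all of its dependencies are done.
        ready = sorted(s for s, d in remaining.items() if d <= done)
        if not ready:
            raise ValueError("cycle detected in pipeline DAG")
        waves.append(ready)
        done.update(ready)
        for step in ready:
            del remaining[step]
    return waves

# fetch and outline are independent; draft needs both; review needs draft.
pipeline = {
    "fetch": set(),
    "outline": set(),
    "draft": {"fetch", "outline"},
    "review": {"draft"},
}
print(execution_waves(pipeline))  # [['fetch', 'outline'], ['draft'], ['review']]
```

Keeping dependencies explicit like this is what lets independent steps (here `fetch` and `outline`) run concurrently without any step racing ahead of its inputs.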

Core intention types include ExecuteShellCommand, WriteFile, ReadFile, WebSearch, WebScrape, WriteMemory, SpawnSubAgent, and CallCustomTool.

Feature Flags (Rust crate)

The pip wheel ships with python and cli enabled. These flags are relevant when depending on the Rust crate directly.

Feature    Description
python     PyO3 bindings for the _zymi_core Python extension module
cli        The zymi CLI binary
runtime    Async runtime and HTTP dependencies used by runtime integrations
webhook    HTTP approval handler built on Axum
services   Event-bus services such as LangFuse

Development

cargo test
cargo test --features services,webhook

cargo clippy -- -D warnings
cargo clippy --features services -- -D warnings

maturin develop --features python,cli

License

MIT

Download files

Download the file for your platform.

Source Distribution

zymi_core-0.1.4.tar.gz (142.7 kB, Source)

Built Distributions

zymi_core-0.1.4-cp311-cp311-win_amd64.whl (4.0 MB, CPython 3.11, Windows x86-64)

zymi_core-0.1.4-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl (3.7 MB, CPython 3.11, manylinux: glibc 2.17+ ARM64)

zymi_core-0.1.4-cp311-cp311-macosx_11_0_arm64.whl (3.6 MB, CPython 3.11, macOS 11.0+ ARM64)

zymi_core-0.1.4-cp311-cp311-macosx_10_12_x86_64.whl (3.8 MB, CPython 3.11, macOS 10.12+ x86-64)

zymi_core-0.1.4-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (4.0 MB, CPython 3.8, manylinux: glibc 2.17+ x86-64)

File details

Details for the file zymi_core-0.1.4.tar.gz.

File metadata

  • Download URL: zymi_core-0.1.4.tar.gz
  • Size: 142.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for zymi_core-0.1.4.tar.gz
Algorithm    Hash digest
SHA256       2d516361c1d6ec02d86e8b41d446be8eab588b7aa1645330e0f6c8a37723a095
MD5          2cb1a3710ba023d6293b37fd9c4fd437
BLAKE2b-256  061390fb2cc98c23d9cf9a3f98a297c4726a78677644854efbe1ca1c053ded3a

Provenance

The following attestation bundles were made for zymi_core-0.1.4.tar.gz:

Publisher: release.yml on metravod/zymi-core

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

