
zymi-core

Event-sourced agent engine for auditable AI workflows in Rust, YAML, and Python.

zymi-core helps you build agent workflows you can inspect after the fact. Every run is recorded as an immutable event stream in SQLite, agent side effects are mediated through intentions and boundary contracts, and pipelines execute as DAGs with parallel steps when possible.

Highlights

  • Auditable by default: every state change is persisted as an event with hash-chain verification.
  • Safer side effects: agents emit intentions first; contracts and approvals decide what is allowed to execute.
  • Practical workflows: define agents and DAG pipelines in YAML, then run them from a small CLI.
  • Declarative custom tools: add HTTP and shell tools in tools/*.yml — no Rust code required. zymi init includes a working example.
  • Flexible integration points: use the Rust crate, Python bindings, or both — Python can drive pipelines directly via Runtime.for_project(...).run_pipeline(...), no subprocess.
  • LLM-provider ready: OpenAI-compatible providers, Anthropic support, Python tools, and LangFuse event services.
  • Automatic context management: observation masking compresses older tool results in-place (~2x cost reduction, no extra LLM calls), with LLM summarization as a graduated fallback when the context grows further.
  • JSON Schemas for configs: zymi schema project|agent|pipeline|tool outputs draft-07 JSON Schema for IDE autocomplete and LLM-assisted config generation.
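The observation-masking idea from the highlights can be sketched in a few lines of plain Python. This is a conceptual illustration, not the crate's actual implementation; the `window` parameter mirrors the `observation_window` setting shown in the project scaffold below.

```python
# Conceptual sketch of observation masking: keep the most recent tool
# results verbatim and replace older ones with a short placeholder, so
# the context shrinks without any extra LLM calls.
def mask_observations(turns, window=10):
    """turns: list of (role, text) pairs; tool results older than the
    last `window` tool-turns are masked in place."""
    tool_indices = [i for i, (role, _) in enumerate(turns) if role == "tool"]
    to_mask = set(tool_indices[:-window]) if window else set(tool_indices)
    return [
        (role, "[observation masked]" if i in to_mask else text)
        for i, (role, text) in enumerate(turns)
    ]
```

Because masking only replaces text that is already in the event log, the original observations remain recoverable from the store even after the working context has been compressed.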

Installation

If you want to...        Install with...
CLI + Python bindings    pip install zymi-core
Rust crate only          zymi-core = "0.1"

pip install zymi-core gives you both the zymi CLI command and the zymi_core Python module.

Quick Start

# Install
pip install zymi-core

# Create a demo project
mkdir zymi-demo
cd zymi-demo
zymi init --example research

# Add your LLM provider config to project.yml, then run the pipeline
zymi run research -i topic="event sourcing"

# Inspect what happened
zymi events --limit 20
zymi verify

For example, this is enough to get started with OpenAI:

llm:
  provider: openai
  model: gpt-4o
  api_key: ${env.OPENAI_API_KEY}

What this gives you:

  • project.yml for provider config, policies, contracts, and defaults
  • agents/ for agent definitions
  • pipelines/ for DAG workflows
  • tools/web_search.yml — a declarative tool example, ready to wire up to a search provider
  • .zymi/events.db for the append-only event log
  • output/ and memory/ directories in the research example

Common CLI commands

zymi init --name my-project
zymi init --example research

zymi run main -i task="Summarize the architecture"
zymi run research -i topic="Rust event sourcing"

# Long-running mode: react to PipelineRequested events from any process
zymi serve research

zymi events
zymi events --stream conversation-1
zymi events --stream conversation-1 --verbose
zymi events --kind tool_call_completed --raw

zymi verify
zymi verify --stream conversation-1

# Discover what's in the project / what has run
zymi pipelines
zymi runs                          # all runs, newest first
zymi runs --pipeline research --limit 20

# Interactive 3-panel TUI: runs ▸ pipeline graph ▸ events
# Inside: Tab cycles panels, Enter expands, f follows tail,
#         Shift+R on a graph node forks-resumes from that step.
zymi observe
zymi observe --run pipeline-research-abc123

# Fork-resume an earlier run from a chosen step (ADR-0018).
# Steps upstream of the fork are frozen — their events are copied verbatim;
# the fork step + DAG-descendants re-run against the current configs on disk.
zymi resume pipeline-research-abc123 --from-step writer
zymi resume pipeline-research-abc123 --from-step writer --dry-run   # preview only

# JSON Schema for configs (useful for IDE autocomplete / LLM generation)
zymi schema project
zymi schema --all

Project Layout

A zymi project is just a directory with YAML files:

my-project/
  project.yml
  agents/
    default.yml
  pipelines/
    main.yml
  tools/          # declarative custom tools
    web_search.yml
  .zymi/
    events.db

The default scaffold created by zymi init is intentionally small:

# project.yml
name: my-project
version: "0.1"

defaults:
  timeout_secs: 30
  max_iterations: 10

policy:
  enabled: true
  allow: ["ls *", "cat *", "echo *"]
  deny: ["rm -rf *"]

# optional — tune context window budget
runtime:
  context:
    observation_window: 10
    soft_cap_chars: 400000
    hard_cap_chars: 600000
    min_tail_turns: 4

# agents/default.yml
name: default
description: "Default agent"
tools:
  - web_search
  - read_file
  - write_memory
max_iterations: 10

# pipelines/main.yml
name: main

steps:
  - id: process
    agent: default
    task: "${inputs.task}"

input:
  type: text

output:
  step: process

The default scaffold also creates tools/web_search.yml — a declarative tool with a shell placeholder and commented-out configs for Brave Search, SerpAPI, and Google Custom Search. Uncomment one, set the API key, and the agent's web_search tool starts returning real results.

Declarative Custom Tools

Drop a YAML file into tools/ to give your agents new capabilities without writing code:

# tools/slack_post.yml
name: slack_post
description: "Post a message to a Slack channel"
parameters:
  type: object
  properties:
    channel:
      type: string
    text:
      type: string
  required: [channel, text]
implementation:
  kind: http
  method: POST
  url: "https://slack.com/api/chat.postMessage"
  headers:
    Authorization: "Bearer ${env.SLACK_TOKEN}"
    Content-Type: "application/json"
  body_template: '{"channel": "${args.channel}", "text": "${args.text}"}'

Then reference it in an agent:

# agents/notifier.yml
name: notifier
tools:
  - web_search
  - slack_post   # ← the custom tool

${env.*} variables are resolved at parse time; ${args.*} are resolved at call time from the LLM's arguments. Name collisions with built-in tools are a hard error.
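The two-phase substitution can be sketched as a pair of hypothetical helpers (not the crate's code): `${env.*}` is expanded once when the YAML is parsed, while `${args.*}` survives parsing and is filled in only when the LLM supplies arguments.

```python
import os
import re

# Hypothetical sketch of the two resolution phases. ${env.X} is expanded
# at parse time; ${args.X} is left intact until call time.
def resolve_env(template: str, env=os.environ) -> str:
    return re.sub(r"\$\{env\.([A-Za-z_][A-Za-z0-9_]*)\}",
                  lambda m: env.get(m.group(1), ""), template)

def resolve_args(template: str, args: dict) -> str:
    return re.sub(r"\$\{args\.([A-Za-z_][A-Za-z0-9_]*)\}",
                  lambda m: str(args.get(m.group(1), "")), template)
```

Keeping the phases separate means secrets are bound once per process, while per-call arguments never leak into the parsed config.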

Python Bindings

The same pip install zymi-core that gives you the CLI also exposes a Runtime for running pipelines directly, plus the lower-level Event, EventBus, EventStore, Subscription, and ToolRegistry primitives for custom integrations.

Run a pipeline from Python

from zymi_core import Runtime

# Loads project.yml + agents/ + pipelines/ from the given directory and
# builds the same Runtime `zymi run` and `zymi serve` use. `approval` is
# either "terminal" (fail-closed prompt on stdin, matches `zymi run`) or
# "none" (intentions tagged RequiresHumanApproval resolve to a deny).
rt = Runtime.for_project(".", approval="terminal")

result = rt.run_pipeline("research", {"topic": "rust event sourcing"})
print(result.success, result.final_output)
for step in result.step_results:
    print(step.step_id, step.iterations, step.success)

rt.bus() and rt.store() hand out Python wrappers over the runtime's own Arcs, so any subscriber you attach there sees exactly the events the handler publishes — there is no second bus over the same SQLite file.

Tool registry and event primitives

from zymi_core import ToolRegistry

registry = ToolRegistry()

@registry.tool
def search(query: str) -> str:
    return f"Results for: {query}"

result = registry.call("search", '{"query":"rust async"}')
intention_json = registry.to_intention("search", '{"query":"rust async"}')
definitions = registry.definitions()

For lower-level event primitives the same package gives you the event store and bus directly:

from zymi_core import Event, EventBus, EventStore

store = EventStore("./events.db")
bus = EventBus(store)
subscription = bus.subscribe()

event = Event(
    stream_id="conversation-1",
    kind={"type": "UserMessageReceived", "data": {
        "content": {"User": "Hello"},
        "connector": "python",
    }},
    source="python",
)

bus.publish(event)
received = subscription.try_recv()

Multi-Process Integration (Django, Celery, scripts)

The Python wrapper for EventStore opens the same SQLite file the Rust side uses. There is no second IPC channel — events written from one process are visible to every other process that opens the same store, and a long-running zymi serve picks them up via a polling tail watcher (see ADR-0012).

The canonical pattern: a web app publishes a PipelineRequested event, zymi serve runs the pipeline, and the result comes back as a PipelineCompleted event with the same correlation_id.

Terminal A — long-running Rust service:

cd my-zymi-project
zymi serve research

Terminal B — any Python process (e.g. a Django view):

import uuid
from zymi_core import Event, EventBus, EventStore

store = EventStore(".zymi/events.db")
bus = EventBus(store)

correlation_id = str(uuid.uuid4())
sub = bus.subscribe_correlation(correlation_id)

event = Event(
    stream_id=f"web-req-{correlation_id}",
    kind={"type": "PipelineRequested", "data": {
        "pipeline": "research",
        "inputs": {"topic": "rust event sourcing"},
    }},
    source="django",
)
event.with_correlation(correlation_id)
bus.publish(event)

# Block until the serve process publishes PipelineCompleted with the
# same correlation_id (timeout in seconds).
result = sub.recv(timeout_secs=300)
print(result.kind)  # {"type": "PipelineCompleted", "data": {...}}

Because the SQLite store is the single source of truth, you also get free auditing: zymi events --stream web-req-... shows everything that happened during the run, and zymi verify checks the hash chain.

Inside zymi serve the PipelineRequested → RunPipeline translation is done by EventCommandRouter (see ADR-0013). It is re-exported from zymi_core::runtime, so if you are building your own scheduler or bot adapter you can wire the same router against your own Runtime without copy-pasting cli/serve.rs.

Rust Crate

Add the crate to your Cargo.toml:

[dependencies]
zymi-core = "0.1"

Example:

use std::sync::Arc;
use zymi_core::{open_store, Event, EventBus, EventKind, Message, StoreBackend};

let store = open_store(StoreBackend::Sqlite { path: "events.db".into() })?;
let bus = EventBus::new(store.clone());

let mut rx = bus.subscribe().await;

let event = Event::new(
    "conversation-1".into(),
    EventKind::UserMessageReceived {
        content: Message::User("Hello".into()),
        connector: "cli".into(),
    },
    "cli".into(),
);

bus.publish(event).await?;
let received = rx.recv().await.unwrap();
assert_eq!(received.kind_tag(), "user_message_received");

let verified_count = store.verify_chain("conversation-1").await?;

For cross-process delivery in your own binary, spawn a StoreTailWatcher on the same store/bus — it polls for events written by other processes and fans them out into local subscribers without re-persisting them:

use std::time::Duration;
use zymi_core::StoreTailWatcher;

let watcher = StoreTailWatcher::new(store.clone(), bus.clone())
    .with_interval(Duration::from_millis(100))
    .spawn();

// ... later, on shutdown:
watcher.stop().await;

How It Works

zymi-core is built around a small set of ideas:

  1. Every meaningful state change becomes an event. The SQLite event store is the source of truth.
  2. Agents express intentions, not side effects. Intentions are evaluated against boundary contracts before execution.
  3. Pipelines are DAGs. Independent steps can run in parallel, while dependencies remain explicit.
  4. Context is event-sourced too. The agent's working context is reassembled from the event log each iteration — older observations are masked automatically, and hybrid compaction kicks in when the budget is exceeded.
  5. Runs stay replayable. You can inspect events with zymi events --stream <id> and verify hash-chain integrity with zymi verify.
  6. Custom tools are declarative. HTTP tools live in tools/*.yml and are dispatched at runtime — no Rust code, no rebuild.
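Idea 3 above reduces to a small scheduling loop. The sketch below (not the engine's actual scheduler) computes "levels" of steps: within a level, every step's dependencies are already done, so those steps could run in parallel.

```python
# Sketch of level-by-level DAG scheduling: at each round, every step whose
# dependencies are all complete is "ready" and can run concurrently.
def schedule_levels(deps: dict[str, set[str]]) -> list[set[str]]:
    done, levels = set(), []
    while len(done) < len(deps):
        ready = {s for s, d in deps.items() if s not in done and d <= done}
        if not ready:
            raise ValueError("cycle detected in pipeline DAG")
        levels.append(ready)
        done |= ready
    return levels
```

For a diamond-shaped pipeline (`fetch` feeding both `analyze` and `summarize`, which both feed `write`), the middle two steps land in the same level and run side by side.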

Core intention types include ExecuteShellCommand, WriteFile, ReadFile, WebSearch, WebScrape, WriteMemory, SpawnSubAgent, and CallCustomTool.
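The hash-chain check behind zymi verify can be illustrated conceptually. This is a sketch only; the field names and hashing details are illustrative, not the actual event schema.

```python
import hashlib
import json

# Conceptual hash chain: each event stores the hash of its predecessor,
# so tampering with any event invalidates every hash after it.
def event_hash(prev_hash: str, payload: dict) -> str:
    body = prev_hash + json.dumps(payload, sort_keys=True)
    return hashlib.sha256(body.encode()).hexdigest()

def verify_chain(events: list[dict]) -> int:
    prev = ""
    for i, ev in enumerate(events):
        if ev["hash"] != event_hash(prev, ev["payload"]):
            raise ValueError(f"chain broken at event {i}")
        prev = ev["hash"]
    return len(events)
```

Editing any payload after the fact changes its hash, which no longer matches the `prev_hash` recorded by the next event, so verification pinpoints the first tampered entry.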

Feature Flags (Rust crate)

The pip wheel ships with python and cli enabled. These flags are relevant when depending on the Rust crate directly.

Feature   Description
python    PyO3 bindings for the _zymi_core Python extension module
cli       The zymi CLI binary
runtime   Async runtime and HTTP dependencies used by runtime integrations
webhook   HTTP approval handler built on Axum
services  Event-bus services such as LangFuse

Development

cargo test
cargo test --features services,webhook

cargo clippy -- -D warnings
cargo clippy --features services -- -D warnings

maturin develop --features python,cli

License

MIT
