Event-sourced agent engine — CLI and Python bindings for auditable AI workflows
zymi-core
Declarative agent engine — dbt for AI workflows, with event sourcing built in.
Pronounced zoomi — like dog zoomies.
Why zymi-core?
Most agent frameworks are imperative Python: you write a script that makes LLM calls, maybe persists some messages, and leaves you guessing about the rest. Debugging a bad run means reading logs, hoping you logged enough.
zymi-core inverts that:
- Declarative, like dbt. Describe agents, tools, pipelines, and integrations in YAML. The engine loads them, validates them, runs them as a DAG. No orchestration code to write.
- Event-sourced. Every state change — inbound message, LLM call, tool result, approval, outbound reply — is an immutable event persisted to SQLite with a per-stream hash chain. Runs are replayable, resumable, and auditable without extra logging.
- Boundary-safe. Agents emit intentions ("I want to write this file", "I want to run this shell command") that pass through contracts and optional human approval before execution. The unsafe thing never happens until someone said yes.
The ergonomic target: you can bring a useful agent online in minutes without writing code, and a year later still be able to say exactly what the agent did on any past run.
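The per-stream hash chain mentioned above is the standard event-sourcing integrity trick: each event's hash covers the previous event's hash, so tampering with or reordering any past event breaks every link after it. A minimal sketch of the idea in plain Python (the actual zymi-core store is Rust over SQLite; field names here are illustrative):

```python
import hashlib
import json

def chain_hash(prev_hash: str, payload: dict) -> str:
    """Hash the previous link together with a canonical encoding of the event."""
    body = json.dumps(payload, sort_keys=True).encode()
    return hashlib.sha256(prev_hash.encode() + body).hexdigest()

def append_event(stream: list, payload: dict) -> None:
    """Append an event whose hash commits to everything before it."""
    prev = stream[-1]["hash"] if stream else "genesis"
    stream.append({"payload": payload, "hash": chain_hash(prev, payload)})

def verify_stream(stream: list) -> bool:
    """Recompute every link; any edit to a past event invalidates the chain."""
    prev = "genesis"
    for event in stream:
        if event["hash"] != chain_hash(prev, event["payload"]):
            return False  # tampered, dropped, or reordered link
        prev = event["hash"]
    return True
```

This is what makes `zymi verify` meaningful: verification needs only the stream itself, no external log.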
Run a Telegram agent in two minutes
This is the primary demo — a real chat bot you talk to on your phone, wired declaratively.
pip install zymi-core
mkdir telegram-agent && cd telegram-agent
zymi init --example telegram
# 1. Create a bot via @BotFather in Telegram; copy the token.
# 2. Fill .env:
cp .env.example .env # edit TELEGRAM_BOT_TOKEN + OPENAI_API_KEY
# 3. Open project.yml, replace "your_username_here" with your actual
# Telegram username (no @). This keeps strangers out of the bot.
source .env
zymi serve chat
Message the bot — it replies within a couple of seconds. Every inbound message, LLM call, and outbound reply is persisted to .zymi/events.db; watch it live with zymi observe.
The entire wiring lives in project.yml:
llm:
  provider: openai
  model: gpt-4o-mini
  api_key: ${env.OPENAI_API_KEY}

connectors:
  - type: http_poll          # long-polls getUpdates (no HTTPS needed)
    name: telegram
    url: "https://api.telegram.org/bot${env.TELEGRAM_BOT_TOKEN}/getUpdates"
    interval_secs: 2
    extract:
      items: "$.result[*]"
      stream_id: "$.message.chat.id"
      content: "$.message.text"
      cursor: { param: offset, from_item: "$.update_id", plus_one: true, persist: true }
    filter:
      "$.message.from.username":
        one_of: ["your_username_here"]
    pipeline: chat           # every accepted message → `zymi serve chat`

outputs:
  - type: http_post          # reply on every pipeline completion
    name: telegram_reply
    on: [ResponseReady]
    url: "https://api.telegram.org/bot${env.TELEGRAM_BOT_TOKEN}/sendMessage"
    headers: { Content-Type: "application/json" }
    body_template: '{"chat_id":"{{ event.stream_id }}","text":{{ event.content | tojson }}}'
    retry: { attempts: 3, backoff_secs: [1, 5, 30] }
That's it. No Rust. No Python. The scaffold ships with a small two-step DAG — respond (assistant with web_search / web_scrape tools) → polish (a brutally lazy reviewer that keeps the draft verbatim unless it's actually broken). Both steps are visible live in zymi observe.
Ask the bot to "announce that we're closing at 5pm" and the agent reaches for tools/broadcast.yml (requires_approval: true). The approvals: section in project.yml DMs you with ✅ / ❌ buttons; nothing goes out until you click. See Approvals on the bus below.
Want a real search backend? Open tools/web_search.yml, uncomment a provider block (Brave, Tavily, SerpAPI, Google), set its key in .env. Out of the box the bot still works — the assistant just answers from its own knowledge.
Same primitives elsewhere
http_inbound / http_poll / http_post cover webhooks and REST for most SaaS. Pick http_inbound when the service can POST to you over HTTPS, http_poll when you're behind NAT or the service has no webhooks. Filter recipes for the common cases:
# GitHub — only react to PR opens, not pushes or issues
filter:
  "$.action": { equals: "opened" }
  "$.pull_request.draft": { equals: false }

# Slack Events API — one channel, skip bot echoes
filter:
  "$.event.channel": { equals: "C0123456789" }
  "$.event.subtype": { equals: null }
Out of the box filter: supports one_of / equals. 429 rate-limiting is handled automatically (Retry-After header + Telegram's body-level hint are both honoured on retries).
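The `filter:` semantics above are small enough to sketch in a few lines of Python. This is an illustrative re-implementation, not the engine's code; it assumes the simple dotted-path subset of JSONPath used in the recipes, where a missing key resolves to `null` (which is exactly why `equals: null` skips Slack bot echoes, since ordinary messages have no `subtype`):

```python
def resolve(path: str, obj):
    """Resolve a simple '$.a.b.c' path; a missing key resolves to None."""
    node = obj
    for key in path.lstrip("$.").split("."):
        if not isinstance(node, dict) or key not in node:
            return None
        node = node[key]
    return node

def passes(filters: dict, item: dict) -> bool:
    """Every rule must match: supports the documented equals / one_of forms."""
    for path, rule in filters.items():
        value = resolve(path, item)
        if "equals" in rule and value != rule["equals"]:
            return False
        if "one_of" in rule and value not in rule["one_of"]:
            return False
    return True
```

For example, a Telegram update from an unexpected username fails `one_of` and is dropped before it ever reaches the pipeline.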
Highlights
Pipelines are DAGs
A pipeline is a list of steps with depends_on: edges. Independent steps run in parallel, dependencies stay explicit.
# pipelines/research.yml
steps:
  - id: search_web
    agent: researcher
    task: "Find articles on: ${inputs.topic}"
  - id: search_deep
    agent: researcher
    task: "Find technical details on: ${inputs.topic}"
  - id: analyze
    agent: researcher
    task: "Cross-reference findings, store a structured summary in memory."
    depends_on: [search_web, search_deep]   # runs after both finish
  - id: write_report
    agent: writer
    task: "Read memory, write ./output/report.md."
    depends_on: [analyze]
Try it: zymi init --example research. The scaffold pre-configures a two-agent pipeline with parallel search.
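The scheduling rule ("independent steps run in parallel, dependencies stay explicit") is plain topological layering. A hedged sketch of how a `depends_on:` graph groups into parallel waves, using the research pipeline's step ids (the engine's actual scheduler is Rust and may interleave more finely):

```python
def waves(steps: dict[str, list[str]]) -> list[list[str]]:
    """Group step ids into waves: a step is ready once every step it
    depends_on has completed. Steps in the same wave can run in parallel."""
    done: set[str] = set()
    order: list[list[str]] = []
    pending = dict(steps)
    while pending:
        ready = sorted(s for s, deps in pending.items() if set(deps) <= done)
        if not ready:
            raise ValueError("cycle in depends_on")
        order.append(ready)
        done.update(ready)
        for s in ready:
            del pending[s]
    return order
```

For the research pipeline this yields three waves: both searches together, then `analyze`, then `write_report`.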
Replay, resume, observe
Because everything is an event, you don't just log runs — you keep them.
zymi runs # all pipeline runs
zymi events --stream pipeline-research-abc # every event in one run
zymi verify --stream pipeline-research-abc # hash-chain integrity check
zymi observe # 3-panel TUI: runs / DAG / events live
# Fork-resume an earlier run from a chosen step. Upstream steps are frozen;
# the fork step + DAG-descendants re-run against current configs on disk.
zymi resume pipeline-research-abc --from-step write_report
zymi resume pipeline-research-abc --from-step write_report --dry-run
Fork-resume is useful when you're iterating on a prompt or tool config: you don't have to re-burn the expensive early steps every time you tweak the later ones. See ADR-0018.
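The "fork step + DAG-descendants re-run" rule comes down to a reachability walk over the dependency graph inverted into child edges. An illustrative sketch (not the engine's code) of which steps a `--from-step` fork would re-run versus freeze:

```python
def rerun_set(edges: dict[str, list[str]], start: str) -> set[str]:
    """edges maps step -> depends_on. Return the fork step plus every
    step downstream of it; everything else stays frozen on resume."""
    children: dict[str, list[str]] = {}
    for step, deps in edges.items():
        for dep in deps:
            children.setdefault(dep, []).append(step)
    rerun, stack = {start}, [start]
    while stack:
        for child in children.get(stack.pop(), []):
            if child not in rerun:
                rerun.add(child)
                stack.append(child)
    return rerun
```

Forking the research pipeline at `write_report` re-runs only that step, while forking at `analyze` also re-runs `write_report`; the expensive searches stay frozen either way.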
MCP servers — N tools per YAML entry
Declarative tools cover HTTP and shell. For anything heavier — filesystem sandbox, git client, search index, proprietary protocol — drop a Model Context Protocol server into project.yml and zymi-core handles the subprocess, handshake, restarts, and tool-catalog wiring.
mcp_servers:
  - name: fs
    command: [npx, -y, "@modelcontextprotocol/server-filesystem", ./sandbox]
    allow: [read_text_file, write_file, list_directory]
    restart: { max_restarts: 2, backoff_secs: [1, 5] }
Then in the agent:
tools:
  - mcp__fs__read_text_file
  - mcp__fs__write_file
  - mcp__fs__list_directory
No per-tool schemas to author — they come from the server's tools/list at startup. Every call is audited as a normal tool event. Probe new servers before wiring:
zymi mcp probe fs -- npx -y @modelcontextprotocol/server-filesystem /tmp
zymi mcp probe gh --env GITHUB_PERSONAL_ACCESS_TOKEN=ghp_... \
-- npx -y @modelcontextprotocol/server-github
End-to-end demo: zymi init --example mcp. Full posture (PATH forwarding, env-isolation, restart policy) in ADR-0023.
Declarative custom tools
HTTP and shell tools live in tools/*.yml. No rebuild, no Rust.
# tools/slack_post.yml
name: slack_post
description: "Post a message to a Slack channel"
parameters:
  type: object
  properties:
    channel: { type: string }
    text: { type: string }
  required: [channel, text]
implementation:
  kind: http
  method: POST
  url: "https://slack.com/api/chat.postMessage"
  headers:
    Authorization: "Bearer ${env.SLACK_TOKEN}"
    Content-Type: "application/json"
  body_template: '{"channel": "${args.channel}", "text": "${args.text}"}'
${env.*} resolves at parse time; ${args.*} resolves at call time from LLM arguments. Collisions with built-in tools are a hard error.
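The two resolution phases can be sketched as two independent substitution passes, one at parse time and one at call time. This is an illustrative model of the documented behaviour, not the engine's implementation (the engine's error handling may differ; here an unknown name simply raises `KeyError`):

```python
import os
import re

def resolve_env(template: str, env=os.environ) -> str:
    """Parse time: substitute ${env.NAME} from the environment.
    An unknown name raises, modelling a hard config error."""
    return re.sub(r"\$\{env\.([A-Za-z_][A-Za-z0-9_]*)\}",
                  lambda m: env[m.group(1)], template)

def resolve_args(template: str, args: dict) -> str:
    """Call time: substitute ${args.name} from the LLM-provided arguments."""
    return re.sub(r"\$\{args\.([A-Za-z_][A-Za-z0-9_]*)\}",
                  lambda m: str(args[m.group(1)]), template)
```

Running `resolve_env` once at load means secrets are bound before any LLM output is in play; only `${args.*}` placeholders remain for the per-call pass.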
Automatic context management
The agent's working context is reconstructed from the event log each iteration, not accumulated in a growing buffer. Older tool observations are masked in-place (~2× cost reduction, no extra LLM calls). When the token budget still gets tight, hybrid compaction summarises the oldest masked batch with a fast LLM call. Both caps are tunable:
runtime:
  context:
    observation_window: 10      # recent turns kept verbatim
    soft_cap_chars: 400000      # triggers LLM summarisation
    hard_cap_chars: 600000      # fatal if exceeded after compaction
    min_tail_turns: 4
Design and trade-offs in ADR-0016.
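The interplay between the window and the soft cap can be sketched as a small planning function. This is a simplified model of the documented behaviour (real masking preserves structure rather than a flat placeholder, and budgets are token-based in practice):

```python
MASK = "[masked observation]"

def plan_context(turns: list[str], observation_window: int,
                 soft_cap_chars: int) -> dict:
    """Keep the newest observation_window turns verbatim, mask the rest,
    and flag when the result still exceeds the soft cap (summarise then)."""
    verbatim = turns[-observation_window:]
    n_masked = max(len(turns) - observation_window, 0)
    total = sum(len(t) for t in verbatim) + len(MASK) * n_masked
    return {"masked": n_masked, "verbatim": len(verbatim),
            "needs_compaction": total > soft_cap_chars}
```

Masking alone is free (no LLM call); the summarisation step only fires when the masked-plus-verbatim size still exceeds `soft_cap_chars`.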
Safer side effects
Agents don't take actions directly — they emit intentions (ExecuteShellCommand, WriteFile, ReadFile, WebSearch, SpawnSubAgent, CallCustomTool, …). Intentions pass through:
- Policy engine — shell command allow/deny patterns.
- Contracts — file-write boundaries, rate limits, tool-specific rules.
- Approval — declarative `approvals:` channels (terminal / http / telegram), or fail-closed when no channel is configured.
Nothing with side effects runs until the intention is approved.
Approvals on the bus
Approvals are declarative in project.yml (ADR-0022). Pick a channel — terminal prompt, HTTP endpoint, or Telegram DM with inline ✅/❌ keyboard — and any tool flagged requires_approval: true routes through it.
default_approval_channel: ops_tg

approvals:
  - type: telegram
    name: ops_tg
    bot_token: "${env.TELEGRAM_BOT_TOKEN}"
    chat_id: "${env.TELEGRAM_ADMIN_CHAT_ID}"
    bind: "127.0.0.1:8088"
    callback_path: /telegram/approval
    secret_token: "${env.TELEGRAM_WEBHOOK_SECRET}"
# tools/broadcast.yml — gated tool
name: broadcast
description: "Send an announcement to the team channel."
parameters: { type: object, properties: { message: { type: string } }, required: [message] }
requires_approval: true
implementation:
  kind: shell
  command_template: "post-to-slack ${args.message}"
When the agent calls broadcast, the bot DMs the admin chat with approve/deny buttons; the click flows back as ApprovalGranted / ApprovalDenied events. Resolution order is pipeline override → project default → fail-closed. Every step is on the event bus, so the audit trail is uniform with the rest of the run, and a hard crash mid-approval is repaired on next start: in-flight requests are redelivered to live channels, expired ones are sealed with ApprovalDenied{reason: restart_timeout}. End-to-end demo: zymi init --example telegram.
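The resolution order (pipeline override, then project default, then fail-closed) can be modelled in a few lines. The `approval_channel` key name on the pipeline is an assumption for illustration; only the ordering itself comes from the docs:

```python
def resolve_approval_channel(tool: dict, pipeline: dict, project: dict):
    """For a tool flagged requires_approval: true, pick the channel in
    documented order: pipeline override -> project default -> fail-closed."""
    if not tool.get("requires_approval"):
        return None  # no gate needed for this tool
    channel = (pipeline.get("approval_channel")          # hypothetical key
               or project.get("default_approval_channel"))
    if channel is None:
        # fail-closed: with no channel configured, the intention is denied
        raise PermissionError("no approval channel configured")
    return channel
```

The important property is the last branch: a missing channel denies the side effect rather than silently allowing it.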
JSON Schemas for configs
IDE autocomplete and LLM-assisted config generation come free:
zymi schema project # draft-07 JSON Schema for project.yml
zymi schema --all
Python bindings
The same pip install zymi-core that gives you the CLI also exposes Runtime, Event, EventBus, EventStore, Subscription, ToolRegistry.
from zymi_core import Runtime
rt = Runtime.for_project(".", approval="terminal")
result = rt.run_pipeline("research", {"topic": "rust event sourcing"})
print(result.success, result.final_output)
rt.bus() and rt.store() hand out wrappers over the same Arcs the Rust runtime uses — a Python subscriber sees exactly what the handler publishes, no second bus over the same SQLite file.
Cross-process events (Django, Celery, scripts)
The SQLite store is the source of truth: events written from one process are visible to every other process that opens the same file, and zymi serve picks them up via a polling tail watcher (ADR-0012).
Canonical pattern — a web app publishes PipelineRequested, the long-running zymi serve runs the pipeline, the result comes back as PipelineCompleted with the same correlation_id.
# Terminal A: zymi serve research
# Terminal B: any Python process — Django view, Celery task, script
import uuid
from zymi_core import Event, EventBus, EventStore

store = EventStore(".zymi/events.db")
bus = EventBus(store)

correlation_id = str(uuid.uuid4())
sub = bus.subscribe_correlation(correlation_id)

event = Event(
    stream_id=f"web-{correlation_id}",
    kind={"type": "PipelineRequested", "data": {
        "pipeline": "research",
        "inputs": {"topic": "rust event sourcing"},
    }},
    source="django",
)
event.with_correlation(correlation_id)
bus.publish(event)

result = sub.recv(timeout_secs=300)
print(result.kind)
Inside zymi serve the PipelineRequested → RunPipeline translation is done by EventCommandRouter — re-exported from zymi_core::runtime, so your own scheduler or bot adapter can drive a Runtime without copying cli/serve.rs.
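The request/response matching underneath this pattern is just a scan for the completion event carrying the same `correlation_id`. A pure-Python sketch of that matching over a plain list of event dicts (the real subscription blocks on the bus instead of scanning):

```python
def find_completion(log: list[dict], correlation_id: str):
    """Return the PipelineCompleted event whose correlation_id matches
    the PipelineRequested we published, or None if it has not arrived."""
    for event in log:
        if (event.get("type") == "PipelineCompleted"
                and event.get("correlation_id") == correlation_id):
            return event
    return None
```

Because both events live in the same SQLite store, any process can perform this lookup after the fact, which is what makes the pattern work across Django, Celery, and one-off scripts.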
Rust crate
[dependencies]
zymi-core = "0.4"
use std::sync::Arc;
use zymi_core::{open_store, Event, EventBus, EventKind, Message, StoreBackend};

let store = open_store(StoreBackend::Sqlite { path: "events.db".into() })?;
let bus = EventBus::new(store.clone());
let mut rx = bus.subscribe().await;

let event = Event::new(
    "conversation-1".into(),
    EventKind::UserMessageReceived {
        content: Message::User("Hello".into()),
        connector: "cli".into(),
    },
    "cli".into(),
);
bus.publish(event).await?;

let received = rx.recv().await.unwrap();
let verified = store.verify_chain("conversation-1").await?;
Feature flags:
| Feature | Description |
|---|---|
| `python` | PyO3 bindings for the `_zymi_core` extension module |
| `cli` | The `zymi` CLI binary (implies `connectors`) |
| `runtime` | Async runtime and HTTP dependencies |
| `webhook` | HTTP approval handler (axum) |
| `services` | Event-bus services (Langfuse) |
| `connectors` | Declarative `connectors:` / `outputs:` (`http_inbound` / `http_poll` / `http_post`, ADR-0021) |
CLI reference
zymi init # minimal scaffold
zymi init --example telegram # declarative chat bot
zymi init --example research # parallel-search + writer pipeline
zymi init --example mcp # agent wired to an MCP server
zymi run <pipeline> -i key=value ... # one-shot run
zymi serve <pipeline> # long-running: react to PipelineRequested
zymi pipelines # list pipelines in the project
zymi runs [--pipeline NAME] [--limit N]
zymi events [--stream ID] [--kind TAG] [--raw]
zymi verify [--stream ID] # hash-chain integrity check
zymi observe [--run ID] # 3-panel TUI
zymi resume <run-id> --from-step <id> [--dry-run]
zymi mcp probe <name> -- <cmd> [args...]
zymi schema project|agent|pipeline|tool
zymi schema --all
How it works
- Every state change becomes an event. The SQLite event store with per-stream hash chain is the source of truth.
- Agents emit intentions, not side effects. Intentions are evaluated against policy + contracts + approvals before execution.
- Pipelines are DAGs. Independent steps run in parallel; dependencies stay explicit.
- Context is event-sourced. The working context is reassembled from the log each iteration — older observations masked, hybrid compaction when the budget gets tight.
- Runs stay replayable. Inspect with `zymi events`, verify with `zymi verify`, fork-resume from any step.
More detail lives in adr/ — each architectural decision is a short markdown file.
Contributing
Bug reports, examples, and PRs welcome. See CONTRIBUTING.md for the dev-loop, test matrix, and ADR workflow.
License
MIT — see LICENSE.