
Durable execution for AI agents, built on ZenML


Kitaru

You build your agents. We make them durable.

Kitaru (来る, "to arrive") — open-source agent infrastructure for Python. Any framework. Any cloud. Built on ZenML.


Docs · Quick Start · Examples · Getting Started Guide


[Screenshot: Kitaru Dashboard]

Your agent crashed at step 6. Kitaru replays from step 6 — not from scratch. Add two decorators to your existing Python agent and get crash recovery, human approval gates, cost tracking, and a full dashboard. No rewrite. No framework lock-in. No distributed systems overhead.
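The mechanics behind "replay from step 6" can be pictured as memoization with persistence: every checkpoint's output is written to storage before the next step runs, so a re-run serves completed steps from storage instead of recomputing them. Here is a toy sketch of that idea in plain Python (illustrative only; this is not Kitaru's actual implementation, and the `CheckpointStore` class is hypothetical):

```python
import json
import os
import tempfile

class CheckpointStore:
    """Persists each named step's output; replays completed steps on re-run."""

    def __init__(self, path):
        self.path = path
        self._data = {}
        if os.path.exists(path):
            with open(path) as f:
                self._data = json.load(f)

    def run(self, name, fn, *args):
        if name in self._data:
            # Step already completed in a previous run: replay, don't recompute.
            return self._data[name]
        result = fn(*args)
        self._data[name] = result
        # Persist before moving on, so a crash after this point loses nothing.
        with open(self.path, "w") as f:
            json.dump(self._data, f)
        return result

store = CheckpointStore(os.path.join(tempfile.mkdtemp(), "state.json"))
step1 = store.run("research", lambda t: f"notes on {t}", "quantum computing")
step2 = store.run("write_draft", lambda r: r.upper(), step1)
```

If the process dies after "research" but before "write_draft", a fresh run with the same store returns the saved research output instantly and resumes at the draft step.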

Why Kitaru?

Python-first, no graph DSL

Write normal Python. Use if, for, try/except — whatever your agent needs. Kitaru gives you two decorators (@flow and @checkpoint) and a handful of utility functions. That's it.

from kitaru import checkpoint, flow

@checkpoint
def research(topic: str) -> str:
    return do_research(topic)

@checkpoint
def write_draft(research: str) -> str:
    return generate_draft(research)

@flow
def writing_agent(topic: str) -> str:
    data = research(topic)
    return write_draft(data)

result = writing_agent.run("quantum computing").wait()

Deployment flexibility

No workers, no message queues, no distributed systems PhD required. Kitaru runs locally with zero config, and scales to production with a single server backed by a SQL database. Deploy your agents anywhere — Kubernetes, Vertex AI, SageMaker, or AzureML — using Kitaru's stack abstraction.

Built-in dashboard

Every execution is observable from day one. See your agent runs, inspect checkpoint outputs, track LLM costs, and approve human-in-the-loop wait steps — all from a visual dashboard that ships free with the Kitaru server.
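A human-in-the-loop wait step is conceptually a blocking gate: the flow pauses until someone records a decision, then continues or aborts. The sketch below illustrates that pattern in plain Python; it is not Kitaru's API (in Kitaru the decision comes from the dashboard, and the `ApprovalGate` class here is hypothetical):

```python
from queue import Queue

class ApprovalGate:
    """Blocks a flow until a reviewer approves or rejects."""

    def __init__(self):
        self._decisions = Queue()

    def approve(self):
        self._decisions.put(True)

    def reject(self):
        self._decisions.put(False)

    def wait(self, timeout=None):
        # Blocks until a decision arrives (or the timeout expires).
        return self._decisions.get(timeout=timeout)

gate = ApprovalGate()
gate.approve()  # in practice: a reviewer clicks "approve" in the dashboard
published = "draft v1" if gate.wait(timeout=1) else None
```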

To start that server locally, run kitaru login after installing kitaru[local]. To connect to an existing remote server, run kitaru login <server>.

Quick Start

Install

pip install kitaru

Or with uv (recommended):

uv pip install kitaru

Optional: start a local Kitaru server

Flows run locally by default with the base install. If you also want the local dashboard and REST API, install the local extra and then run kitaru login with no arguments:

uv pip install "kitaru[local]"
kitaru login
kitaru status

Optional: connect to an existing remote Kitaru server

If you already have a deployed Kitaru server, connect to it explicitly:

kitaru login https://my-server.example.com
# add --project <PROJECT> or other remote-login flags if your setup requires them
kitaru status

Initialize your project

kitaru init

Write your first flow

# agent.py
from kitaru import checkpoint, flow

@checkpoint
def fetch_data(url: str) -> str:
    return "some data"

@checkpoint
def process_data(data: str) -> str:
    return data.upper()

@flow
def my_agent(url: str) -> str:
    data = fetch_data(url)
    return process_data(data)

result = my_agent.run("https://example.com").wait()
print(result)  # SOME DATA

Run it

python agent.py

Every checkpoint's output is persisted automatically. You can inspect what happened, replay from any checkpoint, or resume a waiting flow:

kitaru executions list
kitaru executions get <EXECUTION_ID>
kitaru executions logs <EXECUTION_ID>
kitaru executions replay <EXECUTION_ID> --from process_data
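The semantics of replaying from a named checkpoint can be sketched in a few lines: every step before the chosen one is served from its persisted output, and the chosen step plus everything after it re-runs. This is an illustration of the idea only, not how the kitaru executions replay command is implemented (the replay helper below is hypothetical):

```python
def replay(steps, saved, start_from):
    """steps: ordered (name, fn) pairs; saved: dict of persisted outputs.

    Steps before `start_from` reuse their saved outputs; from `start_from`
    onward, every step is recomputed.
    """
    data, started = None, False
    for name, fn in steps:
        if name == start_from:
            started = True
        if not started and name in saved:
            data = saved[name]   # reuse the persisted checkpoint output
        else:
            data = fn(data)      # recompute from here on
    return data

# Mirrors the my_agent flow above:
steps = [
    ("fetch_data", lambda _: "some data"),
    ("process_data", lambda d: d.upper()),
]
saved = {"fetch_data": "some data"}
result = replay(steps, saved, start_from="process_data")
```

Replaying from process_data skips the fetch entirely and recomputes only the processing step, which is why replays are cheap even for long flows.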

Learn more

  • Getting Started Guide: Full setup walkthrough with all examples
  • Documentation: Complete reference and guides
  • Examples: Runnable workflows for every feature
  • Stack Selection Guide: Deploy to Kubernetes, Vertex AI, SageMaker, or AzureML

Contributing

We welcome contributions! See CONTRIBUTING.md for development setup, code style, and how to submit changes. The default branch is develop — all PRs should target it.

License

Apache 2.0
