
zai-adk-python

English | 简体中文

A Python SDK for building AI agents with LLM integration. Provides a unified interface for multiple LLM providers (OpenAI, Anthropic) with native tool calling, sandboxed execution, skills system, and observability.

Features

  • Unified LLM Interface — Single API for OpenAI and Anthropic with provider-specific optimizations
  • Tool System — FastAPI-style dependency injection with automatic JSON schema generation
  • Sandboxed Execution — Secure filesystem and command isolation for agent tools
  • Skills System — Reusable prompt instructions with hot-reload support
  • MCP Integration — First-class support for Model Context Protocol servers
  • Event Streaming — Real-time agent events (thinking, tool calls, responses)
  • Token Tracking — Built-in cost calculation and usage monitoring
  • Observability — Laminar integration for tracing and debugging

Requirements

  • Python >= 3.11
  • uv for dependency management

Installation

# Clone the repository
git clone <repo-url>
cd zai-adk-python

# Install dependencies
uv sync --group dev

Configuration

Create a .env file in the project root:

# Anthropic
ANTHROPIC_BASE_URL=https://open.bigmodel.cn/api/anthropic
ANTHROPIC_API_KEY=your-api-key
ANTHROPIC_MODEL=glm-4.7

# OpenAI (optional)
OPENAI_BASE_URL=https://open.bigmodel.cn/api/paas/v4/
OPENAI_API_KEY=your-api-key
OPENAI_MODEL=glm-4.7
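
The SDK presumably resolves these settings from the environment at startup. A minimal sketch of that lookup, using the variable names from the .env above (the helper name and the fallback endpoint are assumptions, not the SDK's actual code):

```python
import os

def resolve_anthropic_config(env=os.environ) -> dict:
    """Read the Anthropic settings using the .env keys shown above."""
    return {
        "base_url": env.get("ANTHROPIC_BASE_URL", "https://api.anthropic.com"),
        "api_key": env.get("ANTHROPIC_API_KEY", ""),
        "model": env.get("ANTHROPIC_MODEL", "glm-4.7"),
    }

cfg = resolve_anthropic_config({"ANTHROPIC_MODEL": "glm-4.7", "ANTHROPIC_API_KEY": "sk-test"})
# cfg["model"] is "glm-4.7"; base_url falls back to the default endpoint
```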

Quick Start

Basic Agent

import asyncio
from zai_adk import Agent
from zai_adk.llm.anthropic.chat import ChatAnthropic
from zai_adk.tools import tool

@tool("Calculate the sum of two numbers")
async def add(a: int, b: int) -> int:
    """Add two numbers together."""
    return a + b

async def main():
    agent = Agent(
        llm=ChatAnthropic(model="claude-sonnet-4-5-20250929"),
        tools=[add],
    )

    response = await agent.query("What is 25 + 17?")
    print(response)

asyncio.run(main())

Streaming Events

from zai_adk.agent import ToolCallEvent, ToolResultEvent, FinalResponseEvent

async for event in agent.query_stream("Help me with a task"):
    match event:
        case ToolCallEvent(tool=name, args=args):
            print(f"[Calling {name}]")
        case ToolResultEvent(tool=name, result=result):
            print(f"[Result from {name}]: {result}")
        case FinalResponseEvent(content=text):
            print(f"[Final]: {text}")

Sandboxed File Operations

from pathlib import Path
from zai_adk.sandbox import SandboxConfig
from zai_adk.tools.builtin.sandbox import bash, fs_read, fs_write

agent = Agent(
    llm=ChatAnthropic(model="claude-sonnet-4-5-20250929"),
    tools=[bash, fs_read, fs_write],
    sandbox=SandboxConfig(
        kind="local",
        work_dir="./workspace",
        enforce_boundary=True,
    ),
)
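
With enforce_boundary=True, paths handed to the filesystem tools must stay inside work_dir. The check presumably resembles this sketch (the helper name is hypothetical; the real Sandbox implementation may differ):

```python
from pathlib import Path

def is_inside_boundary(work_dir: str, candidate: str) -> bool:
    """Return True if candidate resolves to a path under work_dir.

    Resolving before comparing defeats '../' traversal attempts.
    """
    root = Path(work_dir).resolve()
    target = (root / candidate).resolve()
    return target == root or root in target.parents

is_inside_boundary("/tmp/workspace", "notes.txt")      # → True
is_inside_boundary("/tmp/workspace", "../etc/passwd")  # → False
```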

Skills System

from zai_adk.skills import Skill, SkillsManager
from zai_adk.tools.builtin import skills

skills_manager = SkillsManager()
skills_manager.register(
    Skill(
        name="code-review",
        description="Review code for bugs and security issues",
        instructions="""You are a code reviewer...""",
    )
)

agent = Agent(
    llm=ChatAnthropic(model="claude-sonnet-4-5-20250929"),
    tools=[skills],
    skills_manager=skills_manager,
)

Or use filesystem-based skills with auto-discovery:

agent = Agent(
    llm=ChatAnthropic(model="claude-sonnet-4-5-20250929"),
    tools=[skills],
    skills_dir="./skills",  # Scans for SKILL.md files
)
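
Auto-discovery scans skills_dir recursively for SKILL.md files. A rough sketch of what that scan might look like (the directory-name-as-skill-name convention is an assumption):

```python
import tempfile
from pathlib import Path

def discover_skills(skills_dir) -> dict[str, str]:
    """Map each skill's directory name to the contents of its SKILL.md."""
    return {
        f.parent.name: f.read_text(encoding="utf-8")
        for f in sorted(Path(skills_dir).rglob("SKILL.md"))
    }

# Demo: one skill directory containing a SKILL.md file.
root = Path(tempfile.mkdtemp())
(root / "code-review").mkdir()
(root / "code-review" / "SKILL.md").write_text("Review code for bugs.", encoding="utf-8")

skills = discover_skills(root)
# {'code-review': 'Review code for bugs.'}
```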

Tool Creation

Use the @tool decorator with type hints for automatic schema generation:

from typing import Annotated
from zai_adk.tools import tool, Depends

# Simple tool
@tool("Get current weather")
async def get_weather(location: str, unit: str = "celsius") -> str:
    """Get the weather for a location."""
    return f"Weather in {location}: 22° {unit}"

# With dependency injection
@tool("Search the database")
async def search_db(
    query: str,
    db_conn: Annotated[DbConnection, Depends(get_db)],
) -> list[dict]:
    """Search the database for matching records."""
    return await db_conn.execute(query)
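
Schema generation from type hints presumably works along these lines. A simplified sketch (the real implementation also handles Annotated/Depends parameters, docstrings, and richer types):

```python
import inspect

# Minimal mapping from Python annotations to JSON-schema types.
_JSON_TYPES = {int: "integer", float: "number", str: "string", bool: "boolean"}

def schema_for(func) -> dict:
    """Build a JSON-schema parameters object from a function signature."""
    properties, required = {}, []
    for name, param in inspect.signature(func).parameters.items():
        properties[name] = {"type": _JSON_TYPES.get(param.annotation, "string")}
        if param.default is inspect.Parameter.empty:
            required.append(name)
    return {"type": "object", "properties": properties, "required": required}

async def get_weather(location: str, unit: str = "celsius") -> str:
    ...

schema = schema_for(get_weather)
# {'type': 'object',
#  'properties': {'location': {'type': 'string'}, 'unit': {'type': 'string'}},
#  'required': ['location']}
```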

Dependency Injection

from zai_adk.sandbox import get_sandbox
from zai_adk.sandbox.base import Sandbox

@tool("Execute a shell command")
async def bash(
    command: str,
    sandbox: Annotated[Sandbox, Depends(get_sandbox)],
) -> str:
    """Run a command in the sandboxed environment."""
    result = await sandbox.exec(command)
    return result.stdout

Ephemeral Tools

Keep only the last N tool outputs in message history:

from pathlib import Path

@tool("Read a file", ephemeral=1)
async def read_file(path: str) -> str:
    """Only the most recent result is kept in context."""
    return Path(path).read_text()
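
Under the hood, ephemeral=N presumably means that older results from that tool are pruned from the message history before the next LLM call. A sketch of the trimming logic (the message shape here is an assumption):

```python
def trim_ephemeral(messages: list[dict], tool_name: str, keep: int) -> list[dict]:
    """Keep only the last `keep` results for tool_name; drop older ones."""
    indices = [i for i, m in enumerate(messages)
               if m.get("role") == "tool" and m.get("tool") == tool_name]
    drop = set(indices[:-keep] if keep else indices)
    return [m for i, m in enumerate(messages) if i not in drop]

history = [
    {"role": "tool", "tool": "read_file", "content": "old contents"},
    {"role": "assistant", "content": "ok"},
    {"role": "tool", "tool": "read_file", "content": "new contents"},
]
trimmed = trim_ephemeral(history, "read_file", keep=1)
# Only the most recent read_file result survives.
```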

Agent Modes

CLI Mode (Default)

Agent stops automatically when the LLM returns text without tool calls:

agent = Agent(llm=..., tools=[...], mode="cli")
response = await agent.query("List all Python files")

Autonomous Mode

Agent requires an explicit done tool call to signal completion:

@tool("Signal task completion")
async def done(message: str) -> str:
    return f"TASK COMPLETE: {message}"

agent = Agent(llm=..., tools=[..., done], mode="autonomous")
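
The difference between the two modes is the loop's stop condition. Roughly (a sketch, not the SDK's actual agent loop):

```python
def should_stop(mode: str, tool_calls: list[str], text: str) -> bool:
    """Decide whether the agent loop ends after an LLM turn.

    cli: stop as soon as the model answers with text and no tool calls.
    autonomous: stop only when the model explicitly calls `done`.
    """
    if mode == "cli":
        return not tool_calls and bool(text)
    if mode == "autonomous":
        return "done" in tool_calls
    raise ValueError(f"unknown mode: {mode}")

should_stop("cli", [], "Here are the files...")   # → True
should_stop("autonomous", [], "Here are...")      # → False: still needs done()
```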

MCP Integration

from zai_adk.mcp import MCPServerConfig

agent = Agent(
    llm=ChatAnthropic(model="claude-sonnet-4-5-20250929"),
    mcp_servers=[
        MCPServerConfig(
            name="filesystem",
            command="npx",
            args=["-y", "@modelcontextprotocol/server-filesystem", "./workspace"],
        ),
    ],
)

Protocol Support

The SDK supports multiple output protocols through a composable interface:

AG-UI Protocol

from zai_adk import Agent
from zai_adk.protocols.agui import AGUIProtocol

agent = Agent(llm=llm, tools=tools)
agui = AGUIProtocol(agent)

async for event in agui.query_stream("Hello"):
    # Handle AG-UI events
    ...

SSE Streaming for HTTP

from fastapi import FastAPI
from fastapi.responses import StreamingResponse
from zai_adk import Agent
from zai_adk.protocols.agui import AGUIProtocol

app = FastAPI()
agent = Agent(llm=llm, tools=tools)
agui = AGUIProtocol(agent)

@app.post("/chat")
async def chat(message: str):
    async def stream():
        async for sse in agui.query_stream_sse(message):
            yield sse

    return StreamingResponse(stream(), media_type="text/event-stream")
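
On the wire, each SSE message is a data: line terminated by a blank line; query_stream_sse presumably yields strings already framed this way. A sketch of the framing itself (not the SDK's code):

```python
import json

def to_sse(event: dict) -> str:
    """Frame one event as a Server-Sent Events message."""
    return f"data: {json.dumps(event)}\n\n"

frame = to_sse({"type": "final", "content": "Hello"})
# 'data: {"type": "final", "content": "Hello"}\n\n'
```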

See docs/protocols.md for more details on implementing custom protocols.

Testing

# Run all tests
uv run pytest tests/ -v

# Skip tests requiring real LLM API calls
uv run pytest tests/ -v -m "not llm"

# Run specific test
uv run pytest tests/llm/test_anthropic_chat.py::test_anthropic_basic_chat -v

Examples

Example            Description
claude_code.py     Claude Code-style tools with sandboxed filesystem
skills_usage.py    Skills system patterns and hot-reload
mcp_usage.py       Model Context Protocol integration
todo_usage.py      Todo list management example

Run an example:

python -m examples.claude_code

Architecture

zai_adk/
├── agent/          # Agent loop with event streaming
├── llm/            # LLM provider abstraction (OpenAI, Anthropic)
├── tools/          # Tool system with dependency injection
├── sandbox/        # Secure execution environments
├── skills/         # Reusable prompt instructions
├── mcp/            # Model Context Protocol integration
├── protocols/      # Protocol converter base interface
│   └── agui/       # AG-UI protocol implementation
├── tokens/         # Usage tracking and cost calculation
└── observability/  # Laminar tracing integration

License

MIT License
