
Self-hosted server for Agent Debugger — FastAPI backend, SQLite/Postgres storage, SSE streaming, and React UI.


Peaky Peek

Local-first agent debugger with replay, failure memory, smart highlights, and drift detection.

pip install peaky-peek-server && peaky-peek --open

Local-first, open-source agent debugger. Capture decisions, replay from checkpoints, visualize reasoning trees — all on your machine, no data sent anywhere.



Peaky Peek demo walkthrough

Why Peaky Peek?

Traditional observability tools weren't built for agent-native debugging:

| Tool | Focus | Problem |
| --- | --- | --- |
| LangSmith | LLM tracing | SaaS-first; your data leaves your machine |
| OpenTelemetry | Infra metrics | Blind to reasoning chains and decision trees |
| Sentry | Error tracking | No insight into why agents chose specific actions |
| Peaky Peek | Agent-native debugging | Local-first, open source, privacy by default |

Peaky Peek captures the causal chain behind every action so you can debug agents like distributed systems: trace failures, replay from checkpoints, and search across reasoning paths.


Quick Start

Option 1: Decorator (simplest)

pip install peaky-peek-server
peaky-peek --open   # launches API + UI at http://localhost:8000

from agent_debugger_sdk import trace

@trace
async def my_agent(prompt: str) -> str:
    # Your agent logic here — traces are captured automatically
    return await llm_call(prompt)
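For intuition, here is a toy sketch of what a tracing decorator does under the hood: wrap the call and record a paired start/end event around it. The event names and module-level buffer are illustrative stand-ins, not the SDK's actual internals:

```python
import functools
import time

EVENTS = []  # stand-in for the SDK's event buffer

def toy_trace(fn):
    """Record a start/end event pair around each call."""
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        EVENTS.append({"event": "call_start", "fn": fn.__name__, "ts": time.time()})
        try:
            result = fn(*args, **kwargs)
            EVENTS.append({"event": "call_end", "fn": fn.__name__, "ok": True})
            return result
        except Exception:
            EVENTS.append({"event": "call_end", "fn": fn.__name__, "ok": False})
            raise
    return wrapper

@toy_trace
def my_agent(prompt: str) -> str:
    return prompt.upper()  # placeholder for real agent logic

result = my_agent("debug me")
```

The real @trace additionally supports async functions, as in the example above.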

Option 2: Context Manager

from agent_debugger_sdk import trace_session

async with trace_session("weather_agent") as ctx:
    await ctx.record_decision(
        reasoning="User asked for weather",
        confidence=0.9,
        chosen_action="call_weather_api",
        evidence=[{"source": "user_input", "content": "What's the weather?"}],
    )
    await ctx.record_tool_call("weather_api", {"city": "Seattle"})
    await ctx.record_tool_result("weather_api", result={"temp": 52, "forecast": "rain"})

Option 3: Zero-Config Auto-Patch (no code changes)

# Set env var, then run your agent normally
PEAKY_PEEK_AUTO_PATCH=true python my_agent.py

Works with PydanticAI, LangChain, OpenAI SDK, CrewAI, AutoGen, LlamaIndex, and Anthropic — no imports or decorators needed.
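Mechanically, auto-patching amounts to monkey-patching each framework's client entry points so every LLM call is recorded before delegating to the original method. A toy illustration of the idea (the client class here is a hypothetical stand-in, not a real framework API):

```python
CALLS = []  # stand-in for the trace buffer

class FakeLLMClient:
    """Hypothetical stand-in for a framework's LLM client."""
    def complete(self, prompt: str) -> str:
        return f"echo: {prompt}"

def auto_patch(cls):
    """Replace cls.complete with a wrapper that records each call."""
    original = cls.complete
    def patched(self, prompt):
        CALLS.append({"prompt": prompt})
        return original(self, prompt)
    cls.complete = patched

auto_patch(FakeLLMClient)

# After patching, existing code runs unchanged but is traced.
client = FakeLLMClient()
out = client.complete("hello")
```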


Framework Integrations

PydanticAI

from pydantic_ai import Agent
from agent_debugger_sdk import init
from agent_debugger_sdk.adapters import PydanticAIAdapter

init()

agent = Agent("openai:gpt-4o")
adapter = PydanticAIAdapter(agent, agent_name="support_agent")

LangChain

from agent_debugger_sdk import init
from agent_debugger_sdk.adapters import LangChainTracingHandler

init()

handler = LangChainTracingHandler(session_id="my-session")
# Pass handler to your LangChain agent's callbacks

OpenAI SDK

No code needed — just set the environment variable:

PEAKY_PEEK_AUTO_PATCH=true python my_openai_agent.py

Or use the simplified decorator:

from agent_debugger_sdk import trace

@trace(name="openai_agent", framework="openai")
async def my_agent(prompt: str) -> str:
    client = openai.AsyncOpenAI()
    response = await client.chat.completions.create(
        model="gpt-4o", messages=[{"role": "user", "content": prompt}]
    )
    return response.choices[0].message.content

Auto-Patch (Any Framework)

import agent_debugger_sdk.auto_patch  # activates on import when PEAKY_PEEK_AUTO_PATCH is set

# Now run your agent normally — all LLM calls are traced automatically

Features

Decision Tree Visualization

Decision Tree visualization demo

Navigate agent reasoning as an interactive tree. Click nodes to inspect events, zoom to explore complex flows, and trace the causal chain from policy to tool call to safety check.
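Conceptually, the tree view is reconstructed from flat trace events that carry a parent reference — the same parent/child links that encode the causal chain. A minimal sketch with an illustrative event schema:

```python
from collections import defaultdict

# Flat events, each pointing at the event that caused it.
events = [
    {"id": "policy", "parent": None},
    {"id": "tool_call", "parent": "policy"},
    {"id": "safety_check", "parent": "tool_call"},
]

def build_tree(events):
    """Map each event id to the ids of its children."""
    children = defaultdict(list)
    for e in events:
        if e["parent"] is not None:
            children[e["parent"]].append(e["id"])
    return dict(children)

tree = build_tree(events)
```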

Checkpoint Replay

Checkpoint replay demo

Time-travel through agent execution with checkpoint-aware playback. Play, pause, step, and seek to any point in the trace. Checkpoints are ranked by restore value, so you can jump straight to the most useful state.
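"Restore value" can be thought of as a score over checkpoints. A hypothetical scoring sketch (the fields and weights are invented for illustration; the server's actual ranking may differ):

```python
def rank_checkpoints(checkpoints):
    """Sort checkpoints so the most useful restore point comes first."""
    def restore_value(cp):
        # Favor checkpoints that skip more replayed events, with a bonus
        # for sitting just before a failure (the usual debugging target).
        return cp["events_skipped"] + (50 if cp["precedes_failure"] else 0)
    return sorted(checkpoints, key=restore_value, reverse=True)

checkpoints = [
    {"id": "a", "events_skipped": 10, "precedes_failure": False},
    {"id": "b", "events_skipped": 5, "precedes_failure": True},
    {"id": "c", "events_skipped": 80, "precedes_failure": False},
]
ranked = rank_checkpoints(checkpoints)
```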

Trace Search

Find specific events across all sessions. Search by keyword, filter by event type, and jump directly to results.
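The search semantics — keyword match plus an optional event-type filter — can be sketched in a few lines (the event schema is illustrative):

```python
def search_events(events, keyword, event_type=None):
    """Return events whose text contains keyword, optionally filtered by type."""
    hits = []
    for e in events:
        if event_type is not None and e["type"] != event_type:
            continue
        if keyword.lower() in e["text"].lower():
            hits.append(e)
    return hits

events = [
    {"type": "decision", "text": "User asked for weather"},
    {"type": "tool_call", "text": "weather_api(city=Seattle)"},
    {"type": "decision", "text": "Summarize results"},
]
hits = search_events(events, "weather", event_type="tool_call")
```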

Failure Clustering & Multi-Agent Coordination


Analytics demo

Adaptive analysis groups similar failures. Inspect planner/critic debates, speaker topology, and prompt policy parameters across multi-agent systems.
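One common way to group similar failures is to cluster on a normalized error signature, stripping out volatile details like numbers and quoted values. A simplified sketch of that idea (not the server's actual algorithm):

```python
import re
from collections import defaultdict

def signature(message: str) -> str:
    """Normalize volatile details (numbers, quoted values) out of an error message."""
    msg = re.sub(r"\d+", "<N>", message)
    msg = re.sub(r"'[^']*'", "'<V>'", msg)
    return msg

def cluster_failures(messages):
    """Group raw failure messages by their normalized signature."""
    clusters = defaultdict(list)
    for m in messages:
        clusters[signature(m)].append(m)
    return dict(clusters)

clusters = cluster_failures([
    "Timeout after 30s calling 'weather_api'",
    "Timeout after 45s calling 'search_api'",
    "KeyError: 'forecast'",
])
```

The two timeouts collapse into one cluster despite differing durations and tool names.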

Session Comparison

Compare two agent runs side-by-side. See diffs in turn count, speaker topology, policies, stance shifts, and grounded decisions.
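At its core, a session diff reports which summary fields changed between two runs. A minimal sketch over hypothetical summary dicts:

```python
def diff_sessions(a, b):
    """Return the keys whose values differ between two session summaries."""
    return {k: (a[k], b[k]) for k in a if a[k] != b[k]}

run_a = {"turns": 6, "topology": "round_robin", "grounded_decisions": 4}
run_b = {"turns": 9, "topology": "round_robin", "grounded_decisions": 3}
delta = diff_sessions(run_a, run_b)
```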


Privacy & Security

  • Local-first by default — no external telemetry, no data leaves your machine
  • Zero-config auto-patching — no credentials or API keys needed for local debugging
  • Optional redaction pipeline — scrub prompts, payloads, and PII via configurable regexes
  • API key authentication — keys stored as bcrypt hashes
  • GDPR/HIPAA friendly — SQLite storage, no cloud dependency
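A redaction pipeline of this kind is essentially an ordered list of regex substitutions applied before anything is persisted. A minimal sketch with illustrative patterns (a real deployment would configure its own):

```python
import re

# Illustrative PII patterns — not the SDK's built-in set.
PATTERNS = [
    (re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"), "<EMAIL>"),
    (re.compile(r"\b\d{3}-\d{2}-\d{4}\b"), "<SSN>"),
]

def redact(text: str) -> str:
    """Apply each pattern in turn before the payload is stored."""
    for pattern, replacement in PATTERNS:
        text = pattern.sub(replacement, text)
    return text

clean = redact("Contact alice@example.com, SSN 123-45-6789")
```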

Deployment

pip (recommended)

pip install peaky-peek-server
peaky-peek --open

Docker

docker build -t peaky-peek .
docker run -p 8000:8000 -v ./traces:/app/traces peaky-peek

Development

git clone https://github.com/acailic/agent_debugger
cd agent_debugger
pip install -e ".[dev]"
pip install fastapi "uvicorn[standard]" "sqlalchemy[asyncio]" aiosqlite alembic aiofiles bcrypt
python3 -m pytest -q
cd frontend && npm install && npm run build

Architecture

System Overview

flowchart TB
    classDef layer fill:#0f172a,stroke:#334155,color:#e2e8f0,stroke-width:2px
    classDef ext fill:none,stroke:#94a3b8,stroke-dasharray:6 3,color:#94a3b8

    AGENT("🤖  Your Agent Code"):::ext

    subgraph RUNTIME[" "]
        direction TB
        SDK["<b>🔌  SDK Layer</b><br/><small>Instrument & capture</small><br/><sub>@trace · TraceContext · Auto-Patch · Adapters</sub>"]:::layer
        INTEL["<b>🧠  Intelligence</b><br/><small>Detect, remember, alert</small><br/><sub>Event Buffer · Pattern Detector · Failure Memory · Replay Engine</sub>"]:::layer
    end

    subgraph SERVER[" "]
        direction TB
        API["<b>🌐  API Server</b><br/><small>FastAPI + SSE</small><br/><sub>11 routers: sessions · traces · replay · search · analytics · compare</sub>"]:::layer
        STORE["<b>💾  Storage</b><br/><small>SQLite WAL · async</small><br/><sub>Events · Checkpoints · Analytics · Embeddings</sub>"]:::layer
    end

    UI["<b>🖥️  Frontend</b><br/><small>React · TypeScript · Vite</small><br/><sub>8 panels: decision tree · timeline · tools · replay · search · analytics · compare · live</sub>"]:::layer

    AGENT ==>|"decorate"| SDK
    SDK ==>|"emit"| INTEL
    INTEL -->|"persist"| STORE
    SDK -.->|"ingest"| API
    API <-->|"query"| STORE
    API ==>|"SSE stream"| UI
    INTEL -.->|"replay"| API

Layer Detail

flowchart LR
    classDef sdk fill:#4f46e5,stroke:#3730a3,color:#fff,stroke-width:2px
    classDef intel fill:#dc2626,stroke:#b91c1c,color:#fff,stroke-width:2px
    classDef api fill:#059669,stroke:#047857,color:#fff,stroke-width:2px
    classDef store fill:#b45309,stroke:#92400e,color:#fff,stroke-width:2px
    classDef ui fill:#7c3aed,stroke:#6d28d9,color:#fff,stroke-width:2px

    subgraph SDK[" 🔌  SDK "]
        direction TB
        DEC["@trace decorator"]:::sdk
        CTX["TraceContext"]:::sdk
        AP["Auto-Patch"]:::sdk
        AD["Framework Adapters"]:::sdk
    end

    subgraph INT[" 🧠  Intelligence "]
        direction TB
        BUF["Event Buffer"]:::intel
        PAT["Pattern Detector"]:::intel
        FMEM["Failure Memory"]:::intel
        ALERT["Alert Engine"]:::intel
        RPLAY["Replay Engine"]:::intel
    end

    subgraph APIL[" 🌐  API "]
        direction TB
        R1["Sessions · Traces"]:::api
        R2["Replay · Search"]:::api
        R3["Analytics · Compare"]:::api
        SSE["SSE Stream"]:::api
    end

    subgraph STO[" 💾  Storage "]
        direction TB
        DB[("SQLite WAL")]:::store
        S1["Events · Checkpoints"]:::store
        S2["Analytics Aggregations"]:::store
    end

    subgraph UIF[" 🖥️  Frontend "]
        direction TB
        DT["Decision Tree"]:::ui
        TL["Trace Timeline"]:::ui
        TI["Tool Inspector"]:::ui
        RP["Session Replay"]:::ui
        SE["Cross-session Search"]:::ui
        AN["Analytics Dashboard"]:::ui
    end

    DEC & CTX --> BUF
    AP & AD --> BUF
    BUF --> PAT & FMEM & ALERT
    BUF --> S1
    S1 --> DB
    DB --> S2
    R1 & R2 & R3 <--> S1
    RPLAY --> R2
    SSE --> DT & TL & RP

See ARCHITECTURE.md for full module breakdown.
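The API-to-frontend hop uses server-sent events, whose wire format is just text frames: an optional `event:` name, one or more `data:` lines, and a blank-line terminator. A minimal formatter sketch (the event name and payload are illustrative):

```python
import json

def sse_frame(event: str, data: dict) -> str:
    """Serialize one server-sent event: named event, JSON data, blank-line terminator."""
    return f"event: {event}\ndata: {json.dumps(data)}\n\n"

frame = sse_frame("trace_event", {"type": "tool_call", "tool": "weather_api"})
```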


Project Status

  • Core debugger — local path end-to-end, stable
  • SDK — @trace, trace_session(), auto-patch for 7 frameworks
  • API — 11 routers: sessions, traces, replay, search, analytics, cost, comparison
  • Frontend — 8 specialized panels (decision tree, replay, checkpoints, search)
  • Tests — 365+ passing, CI on Python 3.10/3.11/3.12

Scientific Foundations

Peaky Peek is informed by research on agent debugging, causal tracing, failure analysis, and adaptive replay. See paper notes for design takeaways from each.

Documentation


Contributing

Contributions are welcome! See CONTRIBUTING.md for guidelines.


License

MIT

Project details


Download files

Download the file for your platform.

Source Distribution

peaky_peek_server-0.1.17.tar.gz (60.3 MB)

Uploaded Source

Built Distribution


peaky_peek_server-0.1.17-py3-none-any.whl (856.7 kB)

Uploaded Python 3

File details

Details for the file peaky_peek_server-0.1.17.tar.gz.

File metadata

  • Download URL: peaky_peek_server-0.1.17.tar.gz
  • Upload date:
  • Size: 60.3 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for peaky_peek_server-0.1.17.tar.gz
| Algorithm | Hash digest |
| --- | --- |
| SHA256 | 3d98e2717f670339173c87066bbccfb4bf79d6e5a303ae2c604e86491683c603 |
| MD5 | 6265c34021b2b29d5d06c7d0e59834f3 |
| BLAKE2b-256 | 34472cbf82b7a5fd3048f11b2d711732f28a7e00fdd3b0f11cd04892e0e0eba2 |


Provenance

The following attestation bundles were made for peaky_peek_server-0.1.17.tar.gz:

Publisher: publish.yml on acailic/agent_debugger

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file peaky_peek_server-0.1.17-py3-none-any.whl.

File hashes

Hashes for peaky_peek_server-0.1.17-py3-none-any.whl
| Algorithm | Hash digest |
| --- | --- |
| SHA256 | ae4c808c285c8e647ed70811aea67900ccd41ec66761b159a9bc8364d4d98fc0 |
| MD5 | 480513f1ca7dc5ae1236b9ec2372f5cd |
| BLAKE2b-256 | 8db14b7211d0bcec863eb7e6aff6a6b9dc15fd8f7531dc0bdc0e37d22b0f7c41 |


Provenance

The following attestation bundles were made for peaky_peek_server-0.1.17-py3-none-any.whl:

Publisher: publish.yml on acailic/agent_debugger

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
