
PandaProbe Python SDK

Python SDK for PandaProbe — open source agent engineering platform.

Installation

pip install pandaprobe

With optional LLM provider wrappers:

pip install "pandaprobe[openai]"       # OpenAI wrapper
pip install "pandaprobe[gemini]"       # Google Gemini wrapper
pip install "pandaprobe[anthropic]"    # Anthropic wrapper

With optional agent framework integrations:

pip install "pandaprobe[langgraph]"         # LangGraph
pip install "pandaprobe[langchain]"         # LangChain (create_agent, LCEL)
pip install "pandaprobe[deepagents]"        # DeepAgents (requires Python ≥3.11)
pip install "pandaprobe[google-adk]"        # Google Agent Development Kit
pip install "pandaprobe[claude-agent-sdk]"  # Anthropic Claude Agent SDK
pip install "pandaprobe[crewai]"            # CrewAI
pip install "pandaprobe[openai-agents]"     # OpenAI Agents SDK

Quick Start

1. Set environment variables

export PANDAPROBE_API_KEY="sk_pp_..."
export PANDAPROBE_PROJECT_NAME="my-project"
export PANDAPROBE_ENDPOINT="https://api.pandaprobe.com"   # optional — this is the default
export PANDAPROBE_ENVIRONMENT="production"   # optional
export PANDAPROBE_RELEASE="v1.2.0"           # optional

The SDK auto-initializes from these environment variables on first use — no explicit init() call is needed. To disable tracing, set PANDAPROBE_ENABLED=false.

You can still use pandaprobe.init(...) for programmatic configuration if preferred.
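To make the precedence between required and optional variables concrete, here is an illustrative stdlib-only sketch of how env-driven configuration with defaults typically works. The `Config` class and `config_from_env` helper are hypothetical, not the SDK's actual internals; only the variable names and defaults come from the Configuration table below.

```python
import os
from dataclasses import dataclass

@dataclass
class Config:
    # Required fields have no default; optional ones mirror the
    # documented defaults (endpoint, enabled).
    api_key: str
    project_name: str
    endpoint: str = "https://api.pandaprobe.com"
    enabled: bool = True

def config_from_env(env=os.environ) -> Config:
    # KeyError on the required variables makes misconfiguration loud;
    # .get() with a default handles the optional ones.
    return Config(
        api_key=env["PANDAPROBE_API_KEY"],
        project_name=env["PANDAPROBE_PROJECT_NAME"],
        endpoint=env.get("PANDAPROBE_ENDPOINT", "https://api.pandaprobe.com"),
        enabled=env.get("PANDAPROBE_ENABLED", "true").lower() != "false",
    )

cfg = config_from_env({"PANDAPROBE_API_KEY": "sk_pp_test",
                       "PANDAPROBE_PROJECT_NAME": "demo"})
```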

2. Decorator-based tracing (custom agents)

import pandaprobe

@pandaprobe.trace(name="my-agent")
def run_agent(query: str):
    @pandaprobe.span(name="llm-call", kind="LLM")
    def call_llm(prompt):
        return openai_client.chat.completions.create(...)

    @pandaprobe.span(name="search", kind="TOOL")
    def search(q):
        return search_engine.search(q)

    context = search(query)
    return call_llm(f"Context: {context}\nQuery: {query}")

3. OpenAI wrapper (automatic LLM tracing)

from pandaprobe.wrappers import wrap_openai
import openai

client = wrap_openai(openai.OpenAI())

# Chat Completions API — automatically traced:
response = client.chat.completions.create(
    model="gpt-5.4-nano",
    messages=[{"role": "user", "content": "Hello"}],
)

# Responses API — also automatically traced, including reasoning summaries
# and built-in tool calls (web_search, function_call, etc.) as child spans:
response = client.responses.create(
    model="gpt-5.4-nano",
    input="Explain recursion in one sentence.",
    reasoning={"effort": "low", "summary": "auto"},
)

4. Gemini wrapper (automatic LLM tracing)

from pandaprobe.wrappers import wrap_gemini
from google import genai

client = wrap_gemini(genai.Client())

response = client.models.generate_content(
    model="gemini-2.5-flash",
    contents="Explain recursion in one sentence.",
)

5. Anthropic wrapper (automatic LLM tracing)

from pandaprobe.wrappers import wrap_anthropic
import anthropic

client = wrap_anthropic(anthropic.Anthropic())

response = client.messages.create(
    model="claude-sonnet-4-20250514",
    max_tokens=150,
    system="You are a concise assistant.",
    messages=[{"role": "user", "content": "Explain recursion in one sentence."}],
)
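The three `wrap_*` helpers above share one idea: return a proxy that intercepts the client's call sites, records metadata, and delegates unchanged. Here is a minimal stdlib-only sketch of that proxy pattern under stated assumptions: `TracedMethod`, `FakeClient`, and the `records` sink are all hypothetical stand-ins, not the SDK's wrapper code.

```python
import time

class TracedMethod:
    """Wraps a single client method; records model and latency per call."""
    def __init__(self, fn, records):
        self._fn = fn
        self._records = records

    def __call__(self, **kwargs):
        start = time.monotonic()
        try:
            # Delegate to the real client call, untouched.
            return self._fn(**kwargs)
        finally:
            self._records.append({
                "model": kwargs.get("model"),
                "latency_s": time.monotonic() - start,
            })

records = []

class FakeClient:
    # Stand-in for openai.OpenAI(), genai.Client(), anthropic.Anthropic()...
    def create(self, **kwargs):
        return {"ok": True}

client = FakeClient()
client.create = TracedMethod(client.create, records)
```

The `try/finally` matters: the call is recorded even when the provider raises, so failed LLM calls still show up as spans.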

6. Agent framework integrations

All integrations below auto-trace agent execution — LLM calls, tool use, handoffs, and more — with no manual span creation.

LangGraph

from pandaprobe.integrations.langgraph import LangGraphCallbackHandler

handler = LangGraphCallbackHandler()
result = graph.invoke(
    {"messages": [HumanMessage(content="hello")]},
    config={"callbacks": [handler]},
)

LangChain

Works with create_agent agents and plain LCEL pipelines (prompt | model | parser).

from langchain.agents import create_agent
from pandaprobe.integrations.langchain import LangChainCallbackHandler

handler = LangChainCallbackHandler()
agent = create_agent(model="openai:gpt-5.4-nano", tools=[...])
result = agent.invoke(
    {"messages": [{"role": "user", "content": "hello"}]},
    config={"callbacks": [handler]},
)

DeepAgents

deepagents is an opinionated harness on top of create_agent. create_deep_agent(...) returns a LangGraph compiled graph, so a single handler captures the parent agent and every sub-agent dispatched via the built-in task tool.

from deepagents import create_deep_agent
from pandaprobe.integrations.deepagents import DeepAgentsCallbackHandler

handler = DeepAgentsCallbackHandler()
agent = create_deep_agent(
    model="openai:gpt-5.4-nano",
    tools=[...],
    system_prompt="...",
    subagents=[{"name": "researcher", "description": "...", "system_prompt": "...", "tools": [...]}],
)
result = agent.invoke(
    {"messages": [{"role": "user", "content": "hello"}]},
    config={"callbacks": [handler]},
)

Google ADK

from pandaprobe.integrations.google_adk import GoogleADKAdapter

adapter = GoogleADKAdapter()
adapter.instrument()

# All Runner.run_async() calls are now traced automatically
result = await runner.run_async(user_id="user-1", session_id="s-1", new_message=msg)

Claude Agent SDK

from pandaprobe.integrations.claude_agent_sdk import ClaudeAgentSDKAdapter

adapter = ClaudeAgentSDKAdapter()
adapter.instrument()

# All client.query() / client.receive_response() calls are now traced automatically
result = client.query(prompt="Explain recursion.")

CrewAI

from pandaprobe.integrations.crewai import CrewAIAdapter

adapter = CrewAIAdapter()
adapter.instrument()

# All crew.kickoff() calls are now traced automatically
result = crew.kickoff()

OpenAI Agents SDK

from pandaprobe.integrations.openai_agents import OpenAIAgentsAdapter

adapter = OpenAIAgentsAdapter()
adapter.instrument()

# All Runner.run() calls are now traced automatically
result = await Runner.run(agent, input="Explain recursion.")
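All of the adapters above expose the same `instrument()` shape: patch a framework's entry point once, and every subsequent call through it is traced. This stdlib-only sketch shows the patching idea only; the `Runner` class, `instrument` function, and `calls` sink are hypothetical, not any adapter's actual code.

```python
calls = []

class Runner:
    # Stand-in for a framework's runner class (e.g. something with a
    # kickoff()/run()/run_async() entry point).
    def run(self, task):
        return f"ran {task}"

def instrument(cls):
    original = cls.run

    def traced(self, task):
        # Record the call, then delegate to the framework's original method.
        calls.append(task)
        return original(self, task)

    cls.run = traced

instrument(Runner)
```

Patching the class (rather than one instance) is what makes "all `crew.kickoff()` calls are now traced" possible: every runner created after `instrument()` goes through the traced path.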

7. Session and user tracking

Group related traces under a session and/or user using the universal context API:

import pandaprobe

# Context managers — scoped to the block
with pandaprobe.session("conversation-123"):
    with pandaprobe.user("user-abc"):
        run_agent("What is recursion?")
        run_agent("Can you give me an example?")

# Imperative — useful for dynamic switching
pandaprobe.set_session("conversation-456")
pandaprobe.set_user("user-xyz")
run_agent("New topic")

Both propagate across all SDK layers (decorators, wrappers, integrations, context managers). Explicit parameters (session_id=, user_id=) take precedence over the context.
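The scoping and precedence rules above can be sketched with two context variables. This is illustrative only: `resolve_ids` is a hypothetical helper showing how explicit parameters can win over the ambient context, not the SDK's internals.

```python
import contextvars
from contextlib import contextmanager

_session = contextvars.ContextVar("session_id", default=None)
_user = contextvars.ContextVar("user_id", default=None)

@contextmanager
def session(session_id):
    # Set for the duration of the block, restore the prior value on exit.
    token = _session.set(session_id)
    try:
        yield
    finally:
        _session.reset(token)

@contextmanager
def user(user_id):
    token = _user.set(user_id)
    try:
        yield
    finally:
        _user.reset(token)

def resolve_ids(session_id=None, user_id=None):
    # Explicit arguments take precedence over the ambient context.
    return (session_id or _session.get(), user_id or _user.get())
```

Using `ContextVar` rather than a global means each async task or thread sees its own session, which is what lets one process serve many conversations concurrently.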

8. Programmatic scoring

pandaprobe.score(
    trace_id="...",
    name="user_satisfaction",
    value=0.9,
    data_type="NUMERIC",
    reason="User clicked thumbs up",
)

9. Flushing

For short-lived scripts, call pandaprobe.flush() before exiting to ensure all traces are sent. For long-running processes, the SDK flushes automatically via a background thread and an atexit handler.

pandaprobe.flush()
pandaprobe.shutdown()
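The batching behavior described above can be sketched in a few lines: traces accumulate in a queue, a size threshold triggers a flush, and an atexit handler drains whatever remains. This is a hypothetical sketch, not the SDK's exporter; `enqueue`, `sent_batches`, and `BATCH_SIZE` are illustrative names (a real background thread would also call `flush()` every `PANDAPROBE_FLUSH_INTERVAL` seconds, omitted here).

```python
import atexit
import queue

BATCH_SIZE = 10          # mirrors the PANDAPROBE_BATCH_SIZE default
_pending = queue.Queue()
sent_batches = []        # a real SDK would POST each batch to the endpoint

def enqueue(trace):
    _pending.put(trace)
    # Size-triggered flush: ship as soon as a full batch is waiting.
    if _pending.qsize() >= BATCH_SIZE:
        flush()

def flush():
    # Drain everything currently queued into one batch.
    batch = []
    while True:
        try:
            batch.append(_pending.get_nowait())
        except queue.Empty:
            break
    if batch:
        sent_batches.append(batch)

# Final drain on interpreter shutdown, so short-lived scripts that
# forget to call flush() still lose nothing queued before exit.
atexit.register(flush)
```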

Configuration

Environment Variable       Default                     Description
PANDAPROBE_API_KEY         (required)                  API key
PANDAPROBE_PROJECT_NAME    (required)                  Project name
PANDAPROBE_ENDPOINT        https://api.pandaprobe.com  Backend URL
PANDAPROBE_ENVIRONMENT     None                        Environment tag (e.g. production, staging)
PANDAPROBE_RELEASE         None                        Release/version tag (e.g. v1.2.0)
PANDAPROBE_ENABLED         true                        Enable/disable SDK
PANDAPROBE_BATCH_SIZE      10                          Traces per flush batch
PANDAPROBE_FLUSH_INTERVAL  5.0                         Seconds between flushes
PANDAPROBE_DEBUG           false                       Verbose logging

Development

make py-install       # Install all deps (providers, examples, dev tools)
make py-lint          # Run linter
make py-format        # Auto-format
make py-test          # Run tests
make py-test-cov      # Tests with coverage

License

MIT

Project details


Download files

Download the file for your platform.

Source Distribution

pandaprobe-0.3.0.tar.gz (494.6 kB)

Uploaded Source

Built Distribution


pandaprobe-0.3.0-py3-none-any.whl (85.9 kB)

Uploaded Python 3

File details

Details for the file pandaprobe-0.3.0.tar.gz.

File metadata

  • Download URL: pandaprobe-0.3.0.tar.gz
  • Upload date:
  • Size: 494.6 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for pandaprobe-0.3.0.tar.gz

  • SHA256: 1a3aa3987642d4077a0ab582fcf5042cc905eb545f423cc6a6fbb9c8415299d9
  • MD5: af648926fdc41e2786e75c62dcf0dd3b
  • BLAKE2b-256: 187e2d4eb1d7406b4c73e7e74d1323bcae97e5871efae95983747b308d682b6a


Provenance

The following attestation bundles were made for pandaprobe-0.3.0.tar.gz:

Publisher: release-python.yml on chirpz-ai/pandaprobe-sdk

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file pandaprobe-0.3.0-py3-none-any.whl.

File metadata

  • Download URL: pandaprobe-0.3.0-py3-none-any.whl
  • Upload date:
  • Size: 85.9 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for pandaprobe-0.3.0-py3-none-any.whl

  • SHA256: f66e3f284835daeec52b897204a98d8669cf7d3ba1542c4acc2b548328925fbc
  • MD5: 534f0db85170b755462896aeac1f4c8d
  • BLAKE2b-256: 1519237160d5811d9796f07c8def8bfb0eafdebdf1641c8770701e786fe4dec1


Provenance

The following attestation bundles were made for pandaprobe-0.3.0-py3-none-any.whl:

Publisher: release-python.yml on chirpz-ai/pandaprobe-sdk

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
