
Official Python SDK for MonkAI - Track and analyze your AI agent conversations

Project description

MonkAI Trace - Python SDK

Official Python client for MonkAI - Monitor, analyze, and optimize your AI agents.

PyPI version Python 3.8+ License: MIT

Features

  • Upload conversation records with full token segmentation
  • Track 4 token types: input, output, process, memory
  • Async support via AsyncMonkAIClient (aiohttp-based)
  • Retry with exponential backoff on transient failures
  • Batch processing with automatic chunking
  • Upload from JSON files (supports your existing data)
  • Session management with automatic cleanup and configurable timeouts
  • Data export - Query records/logs with filters, export to JSON or CSV
  • Structured logging via Python logging module
  • HTTP REST API - Language-agnostic tracing for any runtime
  • Framework Integrations:
    • MonkAI Agent - Native framework with automatic tracking
    • LangChain - Full callback handler support (v0.2+)
    • OpenAI Agents - RunHooks integration
    • Python Logging - Standard logging handler with custom_object metadata
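The retry-with-exponential-backoff behavior listed above can be sketched generically. This is not the SDK's internal implementation, just a minimal illustration of the pattern: double the wait after each failed attempt and add jitter so concurrent clients don't retry in lockstep.

```python
import random
import time

def with_retries(fn, max_attempts=3, base_delay=1.0):
    """Call fn(), retrying on any exception with exponentially growing waits."""
    for attempt in range(max_attempts):
        try:
            return fn()
        except Exception:
            if attempt == max_attempts - 1:
                raise  # out of attempts: surface the last error
            # 2 ** attempt doubles the wait each retry; jitter spreads clients out
            time.sleep(base_delay * (2 ** attempt) + random.uniform(0, 0.1))
```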

Installation

pip install monkai-trace

For framework integrations:

# MonkAI Agent (Native Framework)
pip install monkai-trace monkai-agent

# LangChain
pip install monkai-trace langchain

# OpenAI Agents
pip install monkai-trace openai-agents

Quick Start

Basic Usage

from monkai_trace import MonkAIClient

client = MonkAIClient(tracer_token="tk_your_token")

client.upload_record(
    namespace="customer-support",
    agent="support-bot",
    messages=[
        {"role": "user", "content": "Hello"},
        {"role": "assistant", "content": "Hi! How can I help?"}
    ],
    input_tokens=5,
    output_tokens=10,
    process_tokens=100,
    memory_tokens=20
)

Async Client

import asyncio

from monkai_trace import AsyncMonkAIClient

async def main():
    client = AsyncMonkAIClient(tracer_token="tk_your_token")
    await client.upload_record(
        namespace="my-agent",
        agent="assistant",
        messages=[{"role": "user", "content": "Hello"}],
        input_tokens=5,
        output_tokens=10
    )
    await client.close()

asyncio.run(main())

OpenAI Agents Integration

from agents import Agent, WebSearchTool
from monkai_trace.integrations.openai_agents import MonkAIRunHooks

hooks = MonkAIRunHooks(
    tracer_token="tk_your_token",
    namespace="my-agent",
    batch_size=1
)

agent = Agent(
    name="Assistant",
    instructions="You are helpful",
    tools=[WebSearchTool()]
)

hooks.set_user_id("user_abc123")
hooks.set_user_name("João Silva")
hooks.set_user_channel("whatsapp")

# run_with_tracking is a coroutine, so call it from an async context:
result = await MonkAIRunHooks.run_with_tracking(agent, "Hello!", hooks)

LangChain Integration

from langchain.agents import initialize_agent, load_tools
from langchain.llms import OpenAI
from monkai_trace.integrations.langchain import MonkAICallbackHandler

handler = MonkAICallbackHandler(
    tracer_token="tk_your_token",
    namespace="my-agents"
)

llm = OpenAI(temperature=0)
tools = load_tools(["serpapi"], llm=llm)
agent = initialize_agent(tools, llm, callbacks=[handler])
agent.run("What is the weather in Tokyo?")

MonkAI Agent Framework

from monkai_agent import Agent
from monkai_trace.integrations.monkai_agent import MonkAIAgentHooks

hooks = MonkAIAgentHooks(
    tracer_token="tk_your_token",
    namespace="my-namespace"
)

agent = Agent(
    name="Support Bot",
    instructions="You are a helpful assistant",
    hooks=hooks
)

result = agent.run("Help me with my order")

Upload from JSON Files

client.upload_records_from_json("records.json")
client.upload_logs_from_json("logs.json", namespace="my-agent")
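The layout of records.json is not shown above; a hypothetical file mirroring the fields accepted by upload_record could be generated like this (the exact schema expected by upload_records_from_json is an assumption, so check the SDK docs before relying on it):

```python
import json

# Hypothetical record layout: one JSON array of objects whose keys mirror
# the upload_record() parameters shown earlier in this README.
records = [
    {
        "namespace": "customer-support",
        "agent": "support-bot",
        "messages": [
            {"role": "user", "content": "Hello"},
            {"role": "assistant", "content": "Hi! How can I help?"}
        ],
        "input_tokens": 5,
        "output_tokens": 10,
        "process_tokens": 100,
        "memory_tokens": 20
    }
]

with open("records.json", "w") as f:
    json.dump(records, f, indent=2)
```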

Query & Export Data

result = client.query_records(
    namespace="customer-support",
    agent="Support Bot",
    start_date="2025-01-01",
    limit=50
)

client.export_records(
    namespace="customer-support",
    output_file="conversations.json"
)

client.export_logs(
    namespace="my-agent",
    level="error",
    format="csv",
    output_file="errors.csv"
)

HTTP REST API (Language-Agnostic)

For non-Python runtimes or direct HTTP calls:

import requests

MONKAI_API = "https://lpvbvnqrozlwalnkvrgk.supabase.co/functions/v1/monkai-api"
TOKEN = "tk_your_token"

session = requests.post(
    f"{MONKAI_API}/sessions/create",
    headers={"tracer_token": TOKEN, "Content-Type": "application/json"},
    json={"namespace": "my-agent", "user_id": "user123"}
).json()

requests.post(
    f"{MONKAI_API}/traces/llm",
    headers={"tracer_token": TOKEN, "Content-Type": "application/json"},
    json={
        "session_id": session["session_id"],
        "model": "gpt-4",
        "input": {"messages": [{"role": "user", "content": "Hello"}]},
        "output": {"content": "Hi!", "usage": {"prompt_tokens": 5, "completion_tokens": 3}}
    }
)

See HTTP REST API Guide for complete documentation.

Session Management

MonkAI automatically manages user sessions with configurable timeouts:

  • Default timeout: 2 minutes of inactivity
  • Automatic cleanup: Background thread removes expired sessions
  • Multi-user support: Each user gets isolated sessions
  • Persistent sessions: Optional file-backed session storage with LRU caching

hooks = MonkAIRunHooks(
    tracer_token="tk_your_token",
    namespace="support",
    inactivity_timeout=120
)
hooks.set_user_id("customer-12345")

See Session Management Guide for details.

Token Segmentation

Track 4 token types to understand LLM costs:

Type     Description
Input    User queries and prompts
Output   Agent responses and completions
Process  System prompts, instructions, tool definitions
Memory   Conversation history and context

client.upload_record(
    namespace="analytics",
    agent="data-agent",
    messages=[{"role": "user", "content": "Analyze this"}],
    input_tokens=15,
    output_tokens=200,
    process_tokens=500,
    memory_tokens=100
)

Examples

See the examples/ directory:

Example                               Description
openai_agents_example.py              OpenAI Agents basic integration
openai_agents_multi_agent.py          Multi-agent handoff patterns
monkai_agent_example.py               MonkAI Agent framework
langchain_example.py                  LangChain integration
langchain_conversational.py           LangChain with memory
logging_example.py                    Python logging (scripts)
service_logging_example.py            Python logging (long-running services)
session_management_basic.py           Automatic session creation
session_management_multi_user.py      WhatsApp bot with concurrent users
session_management_custom_timeout.py  Custom timeout configuration
http_rest_basic.py                    HTTP REST API basic usage
http_rest_async.py                    Async HTTP REST client
http_rest_openai.py                   OpenAI + HTTP REST tracing
export_data.py                        Query and export data to JSON/CSV
send_json_files.py                    Upload from JSON files

See examples/README.md for the full guide.

Documentation

Development

git clone https://github.com/BeMonkAI/monkai-trace.git
cd monkai-trace

pip install -e ".[dev]"

pytest tests/ -x -q

Requirements

  • Python 3.8+
  • requests >= 2.32.2
  • pydantic >= 2.0.0
  • aiohttp (optional, for AsyncMonkAIClient)
  • monkai-agent (optional, for MonkAI Agent integration)
  • langchain (optional, for LangChain integration)
  • openai-agents (optional, for OpenAI Agents integration)

Changelog

v0.2.18

  • Updated README and project URLs
  • Synchronized repository metadata

v0.2.17

  • Security: Patched requests dependency (CVE fix, now >= 2.32.2)
  • Security: Added .env files to .gitignore
  • Security: Replaced bare except: with specific exception handling
  • Reliability: Added retry with exponential backoff on all HTTP requests
  • Reliability: Added CI test gate before PyPI publish
  • Usability: Unified async client (base URL, auth headers, endpoints)
  • Usability: Exported AsyncMonkAIClient from package __init__
  • Usability: Fixed TokenUsage.total_tokens auto-calculation
  • Scalability: Automatic session cleanup via background thread
  • Scalability: Microsecond-precision session IDs to prevent collisions
  • Quality: Migrated all print() calls to logging module

License

MIT License - see LICENSE file.

Support

Contributing

Contributions welcome! Please read our Contributing Guide first.

Project details


Download files

Download the file for your platform.

Source Distribution

monkai_trace-0.2.18.tar.gz (51.1 kB)


Built Distribution


monkai_trace-0.2.18-py3-none-any.whl (36.8 kB)


File details

Details for the file monkai_trace-0.2.18.tar.gz.

File metadata

  • Download URL: monkai_trace-0.2.18.tar.gz
  • Size: 51.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.11.14

File hashes

Hashes for monkai_trace-0.2.18.tar.gz
Algorithm Hash digest
SHA256 9a9a09486d54b3a973ae994264354f1d0bdb649032443a778f3bdd862c99562f
MD5 351987cab8e7c45c67a30b63aa0f7c60
BLAKE2b-256 df423f1a74b3d4786a8ac70a085fb7ab94ab8969be71f0a58312061f70299537


File details

Details for the file monkai_trace-0.2.18-py3-none-any.whl.

File metadata

  • Download URL: monkai_trace-0.2.18-py3-none-any.whl
  • Size: 36.8 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.11.14

File hashes

Hashes for monkai_trace-0.2.18-py3-none-any.whl
Algorithm Hash digest
SHA256 2f15ea0e06ea884dc3d507c5d627a3c45a5a7f9a2e700674a6d329200535f08d
MD5 6d250e46d2bed71d8c405ff9ee371163
BLAKE2b-256 d5f3667d91229895add4527e7b44f41251e03a5aef48cdb0b4c432fcc4303ccd

