
langchain-oso

Oso observability integration for LangChain agents.

A callback handler that automatically captures every LangChain agent event and sends it to Oso's observability platform for monitoring, debugging, and security analysis.

Installation

pip install langchain-oso

Or with Poetry:

poetry add langchain-oso

Quick Start

from langchain_oso import OsoObservabilityCallback
from langchain.agents import create_openai_tools_agent, AgentExecutor
from langchain_openai import ChatOpenAI

# Create the callback (reads OSO_AUTH_TOKEN from the environment)
callback = OsoObservabilityCallback(
    agent_id="my-support-agent"
)

# Add it to your agent (`tools` and `prompt` are defined elsewhere)
llm = ChatOpenAI(model="gpt-4")
agent = create_openai_tools_agent(llm, tools, prompt)
agent_executor = AgentExecutor(agent=agent, tools=tools, callbacks=[callback])

# Use your agent - all events are captured automatically
# (run inside an async function, since `ainvoke` must be awaited)
result = await agent_executor.ainvoke({"input": "Hello, how can I help?"})

# Clean up
await callback.aclose()

Using Context Manager (Recommended)

async with OsoObservabilityCallback(agent_id="my-agent") as callback:
    agent_executor = AgentExecutor(agent=agent, tools=tools, callbacks=[callback])
    result = await agent_executor.ainvoke({"input": "Hello"})
    # Automatic cleanup when context exits
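The automatic cleanup above follows Python's standard async context-manager protocol, with `__aexit__` delegating to `aclose()`. The class below is a hypothetical stand-in to illustrate that pattern, not the library's actual implementation:

```python
import asyncio

class SketchCallback:
    """Hypothetical stand-in showing the async context-manager pattern."""

    def __init__(self, agent_id):
        self.agent_id = agent_id
        self.closed = False

    async def aclose(self):
        # The real handler would flush pending events and close its HTTP client.
        self.closed = True

    async def __aenter__(self):
        return self

    async def __aexit__(self, exc_type, exc, tb):
        # Cleanup runs even if the body raised an exception.
        await self.aclose()

async def demo():
    async with SketchCallback(agent_id="my-agent") as cb:
        assert not cb.closed  # still open inside the block
    return cb.closed

print(asyncio.run(demo()))  # → True
```

Because cleanup happens in `__aexit__`, it runs even when the agent raises, which is why the context-manager form is recommended over calling `aclose()` by hand.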

Configuration

Environment Variables

Set these in your environment or .env file:

# Required: Your Oso authentication token
OSO_AUTH_TOKEN=your-token-here

# Optional: Custom Oso endpoint (defaults to https://cloud.osohq.com/api/events)
OSO_ENDPOINT=https://cloud.osohq.com/api/events

# Optional: Enable/disable observability (defaults to true)
OSO_OBSERVABILITY_ENABLED=true

Constructor Parameters

OsoObservabilityCallback(
    endpoint="https://cloud.osohq.com/api/events",  # Oso endpoint URL
    auth_token="your-token",                         # Oso auth token
    enabled=True,                                    # Enable/disable sending events
    session_id="unique-session-id",                  # Group related conversations
    metadata={"user_id": "123", "env": "prod"},     # Custom metadata for all events
    agent_id="my-agent"                              # Agent identifier
)

All parameters are optional and fall back to environment variables or defaults.
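The fallback order (explicit argument → environment variable → default) can be sketched as a small resolver. The env var names mirror those documented above, but the `resolve_config` helper itself is illustrative, not the library's code:

```python
import os

DEFAULT_ENDPOINT = "https://cloud.osohq.com/api/events"

def resolve_config(endpoint=None, auth_token=None, enabled=None):
    """Illustrative: resolve each setting as argument -> env var -> default."""
    return {
        "endpoint": endpoint or os.environ.get("OSO_ENDPOINT", DEFAULT_ENDPOINT),
        "auth_token": auth_token or os.environ.get("OSO_AUTH_TOKEN"),
        "enabled": enabled if enabled is not None
                   else os.environ.get("OSO_OBSERVABILITY_ENABLED", "true").lower() == "true",
    }

os.environ["OSO_AUTH_TOKEN"] = "token-from-env"
cfg = resolve_config(endpoint="https://example.test/events")
print(cfg["endpoint"])    # explicit argument wins
print(cfg["auth_token"])  # falls back to the environment
print(cfg["enabled"])     # defaults to True
```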

What Gets Captured

The callback automatically captures all LangChain events:

LLM Events

  • Model name and configuration
  • Prompts sent to the LLM
  • Generated responses
  • Token usage (prompt, completion, total)
  • Errors and failures

Tool Events

  • Tool name and description
  • Input parameters
  • Output/results
  • Execution duration (milliseconds)
  • Errors and stack traces

Agent Events

  • Agent reasoning and thought process
  • Tool selection decisions
  • Tool input parameters
  • Final outputs
  • Complete execution flow

Chain Events

  • Chain type and name
  • Input parameters
  • Output values
  • Nested chain execution

Execution Summary

At the end of each agent execution, a summary event is sent with:

  • Total execution duration
  • Number of LLM calls, tool calls, and agent steps
  • Total token usage
  • Error count
  • Complete execution trace
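As an illustration of what such a summary aggregates, here is a hypothetical reducer over a list of captured events; the counter names follow the bullets above, but the field layout inside `data` is an assumption:

```python
def summarize(events):
    """Illustrative aggregation of captured events into a summary dict."""
    summary = {"llm_calls": 0, "tool_calls": 0, "agent_steps": 0,
               "total_tokens": 0, "errors": 0}
    for ev in events:
        etype = ev["event_type"]
        if etype == "llm.completed":
            summary["llm_calls"] += 1
            summary["total_tokens"] += ev.get("data", {}).get("total_tokens", 0)
        elif etype == "tool.completed":
            summary["tool_calls"] += 1
        elif etype == "agent.action":
            summary["agent_steps"] += 1
        if etype.endswith(".error"):
            summary["errors"] += 1
    return summary

events = [
    {"event_type": "llm.completed", "data": {"total_tokens": 120}},
    {"event_type": "tool.completed", "data": {}},
    {"event_type": "tool.error", "data": {}},
]
print(summarize(events))
# → {'llm_calls': 1, 'tool_calls': 1, 'agent_steps': 0, 'total_tokens': 120, 'errors': 1}
```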

Event Structure

Every event sent to Oso has this structure:

{
  "event_type": "tool.completed",
  "execution_id": "unique-execution-id",
  "session_id": "conversation-session-id",
  "timestamp": "2024-02-15T10:30:45.123Z",
  "agent_id": "my-agent",
  "data": { /* event-specific data */ },
  "metadata": { /* your custom metadata */ }
}
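A helper that builds payloads in this shape could look like the following; the `make_event` function is hypothetical, and only the field names come from the structure above:

```python
import uuid
from datetime import datetime, timezone

def make_event(event_type, data, agent_id, session_id, metadata=None):
    """Illustrative constructor for the documented event shape."""
    return {
        "event_type": event_type,
        "execution_id": str(uuid.uuid4()),
        "session_id": session_id,
        "timestamp": datetime.now(timezone.utc).isoformat(timespec="milliseconds"),
        "agent_id": agent_id,
        "data": data,
        "metadata": metadata or {},
    }

event = make_event("tool.completed", {"tool": "search_orders"},
                   agent_id="my-agent", session_id="sess-1")
print(sorted(event))
```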

Event Types

  • llm.started / llm.completed / llm.error
  • tool.started / tool.completed / tool.error
  • agent.action / agent.finished
  • chain.started / chain.completed / chain.error
  • execution.summary - Final summary with all accumulated data

Error Handling

The callback is designed to fail gracefully:

  • Network errors or timeouts won't crash your agent
  • Failed event sends are logged but don't interrupt execution
  • 5-second timeout for all HTTP requests
  • Comprehensive error logging for debugging
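The fail-gracefully behavior can be sketched as a wrapper that logs transport errors instead of letting them propagate into the agent. Both `safe_send` and its callable argument are illustrative, not the library's internals:

```python
import logging

logger = logging.getLogger("langchain_oso.sketch")

def safe_send(send_fn, event):
    """Call send_fn(event); log failures instead of raising."""
    try:
        send_fn(event)
        return True
    except Exception:
        # The agent keeps running; the failure is only logged.
        logger.warning("failed to send event %r", event.get("event_type"),
                       exc_info=True)
        return False

def flaky_send(event):
    raise TimeoutError("simulated timeout")

print(safe_send(flaky_send, {"event_type": "tool.completed"}))  # → False
```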

Logging

The callback uses Python's standard logging. Configure it in your application:

import logging

# See debug info about events being sent
logging.basicConfig(level=logging.DEBUG)

# Or just warnings and errors
logging.basicConfig(level=logging.WARNING)

# Or configure just this library
logging.getLogger("langchain_oso").setLevel(logging.DEBUG)

Examples

Basic Agent with Tools

from langchain_oso import OsoObservabilityCallback
from langchain.agents import create_openai_tools_agent, AgentExecutor
from langchain_openai import ChatOpenAI
from langchain.tools import tool
import asyncio

@tool
def search_orders(customer_id: str) -> str:
    """Search for customer orders."""
    return f"Orders for {customer_id}: ORD001, ORD002"

async def main():
    async with OsoObservabilityCallback(agent_id="support-agent") as callback:
        llm = ChatOpenAI(model="gpt-4o-mini")
        tools = [search_orders]

        # `prompt` is your agent prompt, defined elsewhere
        agent = create_openai_tools_agent(llm, tools, prompt)
        agent_executor = AgentExecutor(
            agent=agent,
            tools=tools,
            callbacks=[callback]
        )

        result = await agent_executor.ainvoke({
            "input": "Find orders for customer CUST001"
        })

        print(result["output"])

asyncio.run(main())

With Custom Metadata

callback = OsoObservabilityCallback(
    agent_id="support-agent",
    session_id="user-session-123",
    metadata={
        "user_id": "user-456",
        "environment": "production",
        "version": "1.2.3"
    }
)

Multiple Agents in Same Session

import uuid

session_id = str(uuid.uuid4())

# Agent 1
async with OsoObservabilityCallback(
    agent_id="agent-1",
    session_id=session_id
) as callback1:
    result1 = await agent1_executor.ainvoke(...)

# Agent 2 - same session
async with OsoObservabilityCallback(
    agent_id="agent-2",
    session_id=session_id
) as callback2:
    result2 = await agent2_executor.ainvoke(...)

Requirements

  • Python 3.10+
  • langchain-core >= 0.1.0
  • httpx >= 0.24.0

License

Apache License 2.0

Support
