openai-agents-opentelemetry
OpenTelemetry tracing processor for the OpenAI Agents SDK. Bridges agent traces to any OTLP-compatible backend (Jaeger, Datadog, Honeycomb, Grafana Tempo, Langfuse, etc.).
Installation
pip install openai-agents-opentelemetry
Quick Start
import asyncio

from agents import Agent, Runner, add_trace_processor
from openai_agents_opentelemetry import OpenTelemetryTracingProcessor

# Create and register the OpenTelemetry processor
otel_processor = OpenTelemetryTracingProcessor()
add_trace_processor(otel_processor)

# Now all agent traces will be exported to your configured OTel backend
agent = Agent(name="Assistant", instructions="You are helpful.")

async def main():
    result = await Runner.run(agent, "Hello!")
    print(result.final_output)

asyncio.run(main())
Using OpenTelemetry Only (No OpenAI Backend)
If you want traces to go only to your OpenTelemetry backend:
from agents import set_trace_processors
from openai_agents_opentelemetry import OpenTelemetryTracingProcessor
# Replace the default processor entirely
otel_processor = OpenTelemetryTracingProcessor()
set_trace_processors([otel_processor])
Span Mapping
The processor maps SDK spans to OpenTelemetry spans following OpenTelemetry Semantic Conventions for GenAI:
| SDK Span Type | OTel Span Name | Key Attributes |
|---|---|---|
| Agent | `agent: {name}` | `agent.name`, `agent.tools`, `agent.handoffs` |
| Generation | `chat {model}` | `gen_ai.operation.name`, `gen_ai.provider.name`, `gen_ai.request.model`, `gen_ai.usage.*` |
| Function | `execute_tool {name}` | `gen_ai.tool.name`, `gen_ai.tool.call.arguments`, `gen_ai.tool.call.result` |
| Handoff | `handoff: {from} -> {to}` | `agent.handoff.from`, `agent.handoff.to` |
| Guardrail | `guardrail: {name}` | `agent.guardrail.triggered` |
| Response | `gen_ai.response` | `gen_ai.response.id`, `gen_ai.response.model` |
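To illustrate the naming scheme in the table, the span names can be sketched as simple templates. This is an illustrative helper only, not part of the package's API; the `from`/`to` placeholders are renamed `from_agent`/`to_agent` here because `from` is a Python keyword.

```python
# Hypothetical sketch of the span-naming scheme from the table above.
SPAN_NAME_TEMPLATES = {
    "agent": "agent: {name}",
    "generation": "chat {model}",
    "function": "execute_tool {name}",
    "handoff": "handoff: {from_agent} -> {to_agent}",
    "guardrail": "guardrail: {name}",
    "response": "gen_ai.response",
}

def otel_span_name(span_type: str, **fields: str) -> str:
    """Render the OTel span name for a given SDK span type."""
    return SPAN_NAME_TEMPLATES[span_type].format(**fields)

print(otel_span_name("generation", model="gpt-4o"))  # chat gpt-4o
print(otel_span_name("handoff", from_agent="Triage", to_agent="Billing"))  # handoff: Triage -> Billing
```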
Configuration
Content Capture Configuration
Control what content is captured for privacy and compliance using ProcessorConfig:
from openai_agents_opentelemetry import OpenTelemetryTracingProcessor, ProcessorConfig
config = ProcessorConfig(
    capture_prompts=True,        # Capture prompt content as span events
    capture_completions=True,    # Capture completion content as span events
    capture_tool_inputs=True,    # Capture tool input arguments
    capture_tool_outputs=True,   # Capture tool output results
    max_attribute_length=4096,   # Max length for span attributes
    max_event_length=8192,       # Max length for span event attributes
    baggage_keys=["user.id", "session.id"],  # Propagate context to spans
)
processor = OpenTelemetryTracingProcessor(config=config)
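The length limits cap what is recorded rather than failing the span. A minimal sketch of what such capping might look like; the actual truncation logic (and any truncation marker) is internal to the package:

```python
def truncate(value: str, limit: int) -> str:
    """Hypothetical sketch of the capping that max_attribute_length /
    max_event_length enforce: keep at most `limit` characters."""
    return value if len(value) <= limit else value[:limit]

print(len(truncate("x" * 10000, 4096)))  # 4096
print(truncate("short value", 4096))     # short value (unchanged)
```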
Content Filtering / PII Redaction
Apply custom filtering to redact sensitive data before capture:
import re
from openai_agents_opentelemetry import OpenTelemetryTracingProcessor, ProcessorConfig
def redact_pii(content: str, context: str) -> str:
    """Custom PII redaction callback."""
    # Redact SSNs
    content = re.sub(r"\b\d{3}-\d{2}-\d{4}\b", "[SSN REDACTED]", content)
    # Redact email addresses
    content = re.sub(r"\b[\w.-]+@[\w.-]+\.\w+\b", "[EMAIL REDACTED]", content)
    return content
config = ProcessorConfig(
    capture_prompts=True,
    capture_completions=True,
    content_filter=redact_pii,
)
processor = OpenTelemetryTracingProcessor(config=config)
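Because the filter is a plain callable, it can be exercised on its own before wiring it into the processor. Repeating the callback standalone so the snippet is self-contained:

```python
import re

def redact_pii(content: str, context: str) -> str:
    """Same redaction callback as above, shown standalone for testing."""
    content = re.sub(r"\b\d{3}-\d{2}-\d{4}\b", "[SSN REDACTED]", content)
    content = re.sub(r"\b[\w.-]+@[\w.-]+\.\w+\b", "[EMAIL REDACTED]", content)
    return content

print(redact_pii("SSN 123-45-6789, mail jane@example.com", "prompt"))
# SSN [SSN REDACTED], mail [EMAIL REDACTED]
```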
OpenTelemetry SDK Configuration
The processor uses the globally configured OpenTelemetry TracerProvider. Configure it as you normally would:
from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.grpc.trace_exporter import OTLPSpanExporter
# Configure OpenTelemetry
provider = TracerProvider()
processor = BatchSpanProcessor(OTLPSpanExporter(endpoint="http://localhost:4317"))
provider.add_span_processor(processor)
trace.set_tracer_provider(provider)
# Then add the Agents SDK processor
from agents import add_trace_processor
from openai_agents_opentelemetry import OpenTelemetryTracingProcessor
add_trace_processor(OpenTelemetryTracingProcessor())
Resource Helper
Use the create_resource helper for standard resource attributes:
from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from openai_agents_opentelemetry import create_resource
resource = create_resource(
    service_name="my-agent-service",
    service_version="1.0.0",
    additional_attributes={"deployment.environment": "production"},
)
provider = TracerProvider(resource=resource)
trace.set_tracer_provider(provider)
Span Events
The processor automatically adds span events for content capture (controlled by ProcessorConfig):
| Span Type | Event Name | Attributes |
|---|---|---|
| Generation | `gen_ai.content.prompt` | `gen_ai.prompt` |
| Generation | `gen_ai.content.completion` | `gen_ai.completion` |
| Function | `gen_ai.tool.input` | `gen_ai.tool.call.arguments` |
| Function | `gen_ai.tool.output` | `gen_ai.tool.call.result` |
| Guardrail | `guardrail.evaluated` | `guardrail.name`, `guardrail.triggered` |
| Handoff | `handoff.executed` | `handoff.from`, `handoff.to` |
Metrics
Enable metrics collection for token usage, operation duration, and agent-specific counters:
from openai_agents_opentelemetry import OpenTelemetryTracingProcessor
# Enable metrics collection
processor = OpenTelemetryTracingProcessor(enable_metrics=True)
Standard OTel GenAI Metrics
| Metric Name | Type | Unit | Description |
|---|---|---|---|
| `gen_ai.client.token.usage` | Histogram | `{token}` | Token consumption (input/output) |
| `gen_ai.client.operation.duration` | Histogram | `s` | LLM call duration |
Agent-Specific Metrics
| Metric Name | Type | Unit | Description |
|---|---|---|---|
| `agent.tool.invocations` | Counter | `{invocation}` | Tool/function call count by name |
| `agent.handoffs` | Counter | `{handoff}` | Agent handoff count |
| `agent.guardrail.triggers` | Counter | `{trigger}` | Guardrail trigger count by name |
| `agent.errors` | Counter | `{error}` | Error count by type |
Configuring Metrics with OpenTelemetry SDK
Use create_metrics_views() to configure histogram bucket boundaries according to the OpenTelemetry Semantic Conventions for GenAI. Without these views, the SDK uses default buckets that are not suitable for GenAI workloads:
- Token counts can range from 1 to millions (large context windows like GPT-4 128k or Claude 200k)
- Operation durations follow different patterns than typical HTTP requests
from opentelemetry import metrics
from opentelemetry.sdk.metrics import MeterProvider
from opentelemetry.sdk.metrics.export import PeriodicExportingMetricReader
from opentelemetry.exporter.otlp.proto.grpc.metric_exporter import OTLPMetricExporter
from openai_agents_opentelemetry import (
    OpenTelemetryTracingProcessor,
    create_metrics_views,
)
# Configure metrics with recommended GenAI histogram buckets
exporter = OTLPMetricExporter(endpoint="http://localhost:4317")
reader = PeriodicExportingMetricReader(exporter)
views = create_metrics_views() # Returns views with OTel GenAI recommended bucket boundaries
provider = MeterProvider(metric_readers=[reader], views=views)
metrics.set_meter_provider(provider)
# Then enable metrics in the processor
from agents import add_trace_processor
add_trace_processor(OpenTelemetryTracingProcessor(enable_metrics=True))
The create_metrics_views() function configures the following bucket boundaries:
- `gen_ai.client.token.usage`: `[1, 4, 16, 64, 256, 1024, 4096, 16384, 65536, 262144, 1048576, 4194304, 16777216, 67108864]`
- `gen_ai.client.operation.duration`: `[0.01, 0.02, 0.04, 0.08, 0.16, 0.32, 0.64, 1.28, 2.56, 5.12, 10.24, 20.48, 40.96, 81.92]`
You can also access these bucket constants directly if needed:
from openai_agents_opentelemetry import TOKEN_BUCKETS, DURATION_BUCKETS
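To see how these boundaries partition observed values, you can find the bucket a measurement falls into with the standard library. The boundary list is copied from the documentation above; `bucket_index` is an illustrative helper, not a package function:

```python
import bisect

# Boundaries as documented for gen_ai.client.token.usage
TOKEN_BUCKETS = [1, 4, 16, 64, 256, 1024, 4096, 16384, 65536,
                 262144, 1048576, 4194304, 16777216, 67108864]

def bucket_index(value: float, boundaries: list) -> int:
    """Index of the histogram bucket a value falls into
    (OTel convention: bucket i covers (boundaries[i-1], boundaries[i]])."""
    return bisect.bisect_left(boundaries, value)

print(bucket_index(500, TOKEN_BUCKETS))     # 5: 500 tokens fall in (256, 1024]
print(bucket_index(128000, TOKEN_BUCKETS))  # 9: a 128k context falls in (65536, 262144]
```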
Context Propagation (Baggage)
Propagate context like user IDs or session IDs across all agent spans using OpenTelemetry baggage:
from opentelemetry import baggage, context
from openai_agents_opentelemetry import OpenTelemetryTracingProcessor, ProcessorConfig
# Configure which baggage keys to read and add as span attributes
config = ProcessorConfig(
    baggage_keys=["user.id", "session.id", "tenant.id"]
)
processor = OpenTelemetryTracingProcessor(config=config)
# Set baggage before running agents
ctx = baggage.set_baggage("user.id", "user-123")
ctx = baggage.set_baggage("session.id", "session-456", context=ctx)
# context.attach returns a token (it is not a context manager); detach when done
token = context.attach(ctx)
try:
    # All spans created during this agent run will have user.id and session.id attributes
    result = await Runner.run(agent, "Hello!")
finally:
    context.detach(token)
This enables filtering traces by user, session, or tenant in your observability backend.
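Conceptually, the processor reads the configured keys from the active baggage and copies matching entries onto each span it starts. A dictionary-based sketch of that lookup (illustrative only; the real implementation goes through the OpenTelemetry context API):

```python
def baggage_to_attributes(active_baggage: dict, configured_keys: list) -> dict:
    """Keep only the configured baggage keys to attach as span attributes;
    keys absent from the baggage are simply skipped."""
    return {key: active_baggage[key] for key in configured_keys if key in active_baggage}

attrs = baggage_to_attributes(
    {"user.id": "user-123", "session.id": "session-456", "debug": "1"},
    ["user.id", "session.id", "tenant.id"],
)
print(attrs)  # {'user.id': 'user-123', 'session.id': 'session-456'}
```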
Compatibility
This package is tested weekly against the latest OpenAI Agents SDK to ensure compatibility.
License
MIT