Cloudbase Agent Python Observability

OpenTelemetry-based observability for Cloudbase Agent Python SDK with OpenInference semantic conventions.

✨ Key Features

  • 🔌 Zero Configuration - Automatic stdout export via AUTO_TRACES_STDOUT=true
  • 🎯 OpenInference Compatible - Follows OpenInference semantic conventions for AI/LLM tracing
  • 🔗 Full Span Hierarchy - Captures Cloudbase Agent.Server → Adapter → Framework → LLM/Tool spans
  • 🚀 Built-in Configuration API - Simple ConsoleTraceConfig() / OTLPTraceConfig() setup
  • 📊 OTLP Export - Works with Langfuse, Jaeger, and any OTLP-compatible backend
  • 🔀 Hybrid Mode - Combine ENV console export with parameter OTLP export

🚀 Quick Start

Option 1: Built-in Configuration (Recommended)

from cloudbase_agent.server import AgentServiceApp
from cloudbase_agent.observability.server import ConsoleTraceConfig

app = AgentServiceApp(observability=ConsoleTraceConfig())
app.run(create_agent, port=8000)

Option 2: AUTO_TRACES_STDOUT (Zero Code)

export AUTO_TRACES_STDOUT=true
python your_agent.py

Option 3: OTLP Export (Langfuse, etc.)

from cloudbase_agent.server import AgentServiceApp
from cloudbase_agent.observability.server import OTLPTraceConfig

app = AgentServiceApp(
    observability=OTLPTraceConfig(
        url="https://cloud.langfuse.com/api/public/otlp/v1/traces",
        headers={"Authorization": "Basic <credentials>"}
    )
)
app.run(create_agent, port=8000)

Option 4: Hybrid (ENV Console + Parameter OTLP)

import os
os.environ["AUTO_TRACES_STDOUT"] = "true"  # Console from ENV

from cloudbase_agent.server import AgentServiceApp
from cloudbase_agent.observability.server import OTLPTraceConfig

app = AgentServiceApp(
    observability=OTLPTraceConfig(url="...")
)
# Both console (from ENV) and OTLP (from param) receive traces!
app.run(create_agent, port=8000)

📦 Installation

# Install observability package
pip install cloudbase-agent-observability

# Or with adapter
pip install cloudbase-agent-langgraph[observability]
pip install cloudbase-agent-crewai[observability]

🛠️ Configuration API

Configuration Types

| Type | Description |
| --- | --- |
| `ConsoleTraceConfig()` | Export to stdout (JSON format) |
| `OTLPTraceConfig(url=...)` | Export to an OTLP backend |
| `CustomTraceConfig(setup=fn)` | Custom setup function |
| `BatchConfig()` | Batch processing options |

ConsoleTraceConfig

from cloudbase_agent.observability.server import ConsoleTraceConfig, BatchConfig

# Default console export
config = ConsoleTraceConfig()

# With custom batch settings
config = ConsoleTraceConfig(
    batch=BatchConfig(max_export_batch_size=200)
)

OTLPTraceConfig

from cloudbase_agent.observability.server import OTLPTraceConfig

config = OTLPTraceConfig(
    url="https://cloud.langfuse.com/api/public/otlp/v1/traces",
    headers={"Authorization": "Basic <credentials>"},
    timeout=10000,
)
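The `<credentials>` placeholder is a Base64-encoded value. As a minimal sketch, assuming a Langfuse-style public/secret key pair (the keys below are hypothetical placeholders), the header value can be built like this:

```python
import base64

def basic_auth_header(public_key: str, secret_key: str) -> str:
    """Build a Basic auth header value from a key pair."""
    token = base64.b64encode(f"{public_key}:{secret_key}".encode()).decode()
    return f"Basic {token}"

# Placeholder keys for illustration only -- use your own.
header = basic_auth_header("pk-lf-123", "sk-lf-456")
```

The resulting string can be passed directly as the `Authorization` value in `headers`.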

CustomTraceConfig

from cloudbase_agent.observability.server import CustomTraceConfig

def my_setup():
    # Your custom OTel setup
    ...

config = CustomTraceConfig(setup=my_setup)

📋 Usage Examples

LangGraph

from cloudbase_agent.server import AgentServiceApp
from cloudbase_agent.langgraph import LangGraphAgent
from cloudbase_agent.observability.server import ConsoleTraceConfig

app = AgentServiceApp(observability=ConsoleTraceConfig())
app.run(lambda: {"agent": LangGraphAgent(graph=workflow.compile())}, port=8000)

CrewAI

import os
os.environ['CREWAI_DISABLE_TELEMETRY'] = 'true'  # Before CrewAI imports!

from cloudbase_agent.server import AgentServiceApp
from cloudbase_agent.crewai import CrewAIAgent
from cloudbase_agent.observability.server import ConsoleTraceConfig

app = AgentServiceApp(observability=ConsoleTraceConfig())
app.run(create_agent, port=8000)

🔧 Advanced: Manual TracerProvider

For advanced use cases that require a custom TracerProvider:

from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import ConsoleSpanExporter, BatchSpanProcessor
from opentelemetry.sdk.resources import Resource

resource = Resource.create({"service.name": "my-app"})
provider = TracerProvider(resource=resource)
provider.add_span_processor(BatchSpanProcessor(ConsoleSpanExporter()))
trace.set_tracer_provider(provider)

app = AgentServiceApp()  # Will use your provider
app.run(create_agent, port=8000)

Note: This bypasses the built-in configuration API. Use only when necessary.


📊 Span Hierarchy Example

Cloudbase Agent.Server
  └─ Adapter.LangGraph
      └─ LangGraph
          ├─ node_a
          │   └─ ChatOpenAI
          └─ node_b
              └─ ChatOpenAI
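With console export, each span is printed as a JSON object that includes its own span ID and its parent's ID, so a hierarchy like the one above can be rebuilt from the output. A minimal sketch, assuming spans in the OpenTelemetry `ConsoleSpanExporter` JSON shape (`context.span_id` and `parent_id` fields; the span lines below are hypothetical examples):

```python
import json
from collections import defaultdict

def render_tree(span_lines):
    """Rebuild an indented span tree from JSON span lines."""
    spans = [json.loads(line) for line in span_lines]
    # Group spans by their parent ID; roots have parent_id == null.
    children = defaultdict(list)
    for span in spans:
        children[span.get("parent_id")].append(span)

    def walk(parent_id, depth):
        out = []
        for span in children.get(parent_id, []):
            out.append("  " * depth + span["name"])
            out.extend(walk(span["context"]["span_id"], depth + 1))
        return out

    return "\n".join(walk(None, 0))

# Hypothetical span records for illustration.
lines = [
    '{"name": "Cloudbase Agent.Server", "context": {"span_id": "a"}, "parent_id": null}',
    '{"name": "Adapter.LangGraph", "context": {"span_id": "b"}, "parent_id": "a"}',
    '{"name": "node_a", "context": {"span_id": "c"}, "parent_id": "b"}',
]
print(render_tree(lines))
```

The bundled `parse_console_spans.py` script shown in the Testing section serves the same purpose against real output.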

🆚 Built-in vs Manual

| Aspect | Built-in API | Manual Setup |
| --- | --- | --- |
| Lines of code | 1-2 | 20+ |
| Type safety | ✅ | ❌ |
| Batch defaults | ✅ Optimized | ❌ Manual |
| Multiple exporters | ✅ Array | ❌ Complex |
| ENV support | ✅ AUTO_TRACES_STDOUT | ❌ |

Migration:

# Before (Manual)
from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import ConsoleSpanExporter, BatchSpanProcessor
from opentelemetry.sdk.resources import Resource

resource = Resource.create({"service.name": "my-app"})
provider = TracerProvider(resource=resource)
provider.add_span_processor(BatchSpanProcessor(ConsoleSpanExporter()))
trace.set_tracer_provider(provider)
app = AgentServiceApp()

# After (Built-in)
from cloudbase_agent.server import AgentServiceApp
from cloudbase_agent.observability.server import ConsoleTraceConfig

app = AgentServiceApp(observability=ConsoleTraceConfig())

🧪 Testing

# Run built-in config test
python examples/observability/langgraph_simple.py

# Parse span hierarchy
python examples/observability/langgraph_simple.py 2>&1 | \
    python examples/observability/parse_console_spans.py

📝 License

MIT
