
Cloudbase Agent Python Observability

OpenTelemetry-based observability for Cloudbase Agent Python SDK with OpenInference semantic conventions.

✨ Key Features

  • 🔌 Zero Configuration - Automatic stdout export via AUTO_TRACES_STDOUT=true
  • 🎯 OpenInference Compatible - Follows OpenInference semantic conventions for AI/LLM tracing
  • 🔗 Full Span Hierarchy - Captures Cloudbase Agent.Server → Adapter → Framework → LLM/Tool spans
  • 🚀 Built-in Configuration API - Simple ConsoleTraceConfig() / OTLPTraceConfig() setup
  • 📊 OTLP Export - Works with Langfuse, Jaeger, and any OTLP-compatible backend
  • 🔀 Hybrid Mode - Combine ENV console export with parameter OTLP export

🚀 Quick Start

Option 1: Built-in Configuration (Recommended)

from cloudbase_agent.server import AgentServiceApp
from cloudbase_agent.observability.server import ConsoleTraceConfig

app = AgentServiceApp(observability=ConsoleTraceConfig())
app.run(create_agent, port=8000)

Option 2: AUTO_TRACES_STDOUT (Zero Code)

export AUTO_TRACES_STDOUT=true
python your_agent.py

Option 3: OTLP Export (Langfuse, etc.)

from cloudbase_agent.server import AgentServiceApp
from cloudbase_agent.observability.server import OTLPTraceConfig

app = AgentServiceApp(
    observability=OTLPTraceConfig(
        url="https://cloud.langfuse.com/api/public/otlp/v1/traces",
        headers={"Authorization": "Basic <credentials>"}
    )
)
app.run(create_agent, port=8000)

Option 4: Hybrid (ENV Console + Parameter OTLP)

import os
os.environ["AUTO_TRACES_STDOUT"] = "true"  # Console from ENV

from cloudbase_agent.server import AgentServiceApp
from cloudbase_agent.observability.server import OTLPTraceConfig

app = AgentServiceApp(
    observability=OTLPTraceConfig(url="...")
)
# Both console (from ENV) and OTLP (from param) receive traces!
app.run(create_agent, port=8000)

📦 Installation

# Install observability package
pip install cloudbase-agent-observability

# Or bundled with an adapter
pip install cloudbase-agent-langgraph[observability]
pip install cloudbase-agent-crewai[observability]

🛠️ Configuration API

Configuration Types

Type                         Description
ConsoleTraceConfig()         Export to stdout (JSON format)
OTLPTraceConfig(url=...)     Export to an OTLP backend
CustomTraceConfig(setup=fn)  Custom setup function
BatchConfig()                Batch processing options

ConsoleTraceConfig

from cloudbase_agent.observability.server import ConsoleTraceConfig, BatchConfig

# Default console export
config = ConsoleTraceConfig()

# With custom batch settings
config = ConsoleTraceConfig(
    batch=BatchConfig(max_export_batch_size=200)
)

OTLPTraceConfig

from cloudbase_agent.observability.server import OTLPTraceConfig

config = OTLPTraceConfig(
    url="https://cloud.langfuse.com/api/public/otlp/v1/traces",
    headers={"Authorization": "Basic <credentials>"},
    timeout=10000,
)
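Langfuse's OTLP endpoint authenticates with HTTP Basic auth derived from your project's public and secret keys. A minimal sketch of building that header with the standard library; the key values below are placeholders, not real credentials:

```python
import base64

# Hypothetical key values; substitute your own Langfuse project keys.
public_key = "pk-lf-example"
secret_key = "sk-lf-example"

# Basic auth token is base64("<public_key>:<secret_key>").
token = base64.b64encode(f"{public_key}:{secret_key}".encode()).decode()
auth_header = {"Authorization": f"Basic {token}"}
print(auth_header["Authorization"])
```

The resulting dict can be passed directly as the headers= argument shown above.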

CustomTraceConfig

from cloudbase_agent.observability.server import CustomTraceConfig

def my_setup():
    # Your custom OTel setup
    ...

config = CustomTraceConfig(setup=my_setup)

📋 Usage Examples

LangGraph

from cloudbase_agent.server import AgentServiceApp
from cloudbase_agent.langgraph import LangGraphAgent
from cloudbase_agent.observability.server import ConsoleTraceConfig

app = AgentServiceApp(observability=ConsoleTraceConfig())
app.run(lambda: {"agent": LangGraphAgent(graph=workflow.compile())}, port=8000)

CrewAI

import os
os.environ['CREWAI_DISABLE_TELEMETRY'] = 'true'  # Before CrewAI imports!

from cloudbase_agent.server import AgentServiceApp
from cloudbase_agent.crewai import CrewAIAgent
from cloudbase_agent.observability.server import ConsoleTraceConfig

app = AgentServiceApp(observability=ConsoleTraceConfig())
app.run(create_agent, port=8000)

🔧 Advanced: Manual TracerProvider

For advanced use cases that require a custom TracerProvider:

from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import ConsoleSpanExporter, BatchSpanProcessor
from opentelemetry.sdk.resources import Resource

resource = Resource.create({"service.name": "my-app"})
provider = TracerProvider(resource=resource)
provider.add_span_processor(BatchSpanProcessor(ConsoleSpanExporter()))
trace.set_tracer_provider(provider)

app = AgentServiceApp()  # Will use your provider
app.run(create_agent, port=8000)

Note: This bypasses the built-in configuration API. Use only when necessary.


📊 Span Hierarchy Example

Cloudbase Agent.Server
  └─ Adapter.LangGraph
      └─ LangGraph
          ├─ node_a
          │   └─ ChatOpenAI
          └─ node_b
              └─ ChatOpenAI
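With console export, each span is printed as one JSON object, and a hierarchy like the one above can be rebuilt from span and parent IDs. A minimal stdlib sketch, assuming the field layout of OpenTelemetry's ConsoleSpanExporter output (name, context.span_id, parent_id); the two sample spans are invented for illustration, and real output carries many more fields:

```python
import json

# Hand-written sample spans in roughly the shape ConsoleSpanExporter emits.
lines = [
    '{"name": "Adapter.LangGraph", "context": {"span_id": "0xa1"}, "parent_id": null}',
    '{"name": "LangGraph", "context": {"span_id": "0xb2"}, "parent_id": "0xa1"}',
]

spans = [json.loads(line) for line in lines]
children = {}  # parent span_id -> list of child span names
for span in spans:
    children.setdefault(span["parent_id"], []).append(span["name"])

def render(parent_id, depth=0):
    """Return the span tree as lines, indenting one level per parent/child hop."""
    out = []
    for name in children.get(parent_id, []):
        out.append("  " * depth + name)
        span_id = next(s["context"]["span_id"] for s in spans if s["name"] == name)
        out.extend(render(span_id, depth + 1))
    return out

tree = render(None)  # root spans have no parent
print("\n".join(tree))
```

This is the same idea as the bundled parse_console_spans.py helper shown in the Testing section.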

🆚 Built-in vs Manual

Aspect              Built-in API             Manual Setup
Lines of code       1-2                      20+
Type safety         ✅                       ❌
Batch defaults      ✅ Optimized             ❌ Manual
Multiple exporters  ✅ Array                 ❌ Complex
ENV support         ✅ AUTO_TRACES_STDOUT    ❌

Migration:

# Before (Manual)
from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import ConsoleSpanExporter, BatchSpanProcessor
from opentelemetry.sdk.resources import Resource

resource = Resource.create({"service.name": "my-app"})
provider = TracerProvider(resource=resource)
provider.add_span_processor(BatchSpanProcessor(ConsoleSpanExporter()))
trace.set_tracer_provider(provider)
app = AgentServiceApp()

# After (Built-in)
from cloudbase_agent.server import AgentServiceApp
from cloudbase_agent.observability.server import ConsoleTraceConfig

app = AgentServiceApp(observability=ConsoleTraceConfig())

🧪 Testing

# Run built-in config test
python examples/observability/langgraph_simple.py

# Parse span hierarchy
python examples/observability/langgraph_simple.py 2>&1 | \
    python examples/observability/parse_console_spans.py

📝 License

MIT
