TraceAI instrumentation for IBM BeeAI Framework
TraceAI BeeAI Framework Integration
Comprehensive observability for IBM BeeAI Framework with TraceAI.
BeeAI Framework is an open-source toolkit from IBM Research for building production-grade multi-agent systems. This integration provides seamless tracing by leveraging BeeAI's built-in OpenInference instrumentation.
Features
- Zero-config integration: Works with BeeAI's native OpenInference support
- Automatic tracing: Agent runs, tool calls, LLM interactions, and workflows
- Token tracking: Input/output tokens and usage metrics
- Session correlation: Link traces across conversations
- Custom middleware: Extended event capture for detailed observability
- IBM Granite support: Optimized for IBM Granite and Llama models
Installation
```shell
pip install traceai-beeai
```
For full functionality with BeeAI:
```shell
pip install beeai-framework openinference-instrumentation-beeai
```
Quick Start
IMPORTANT: Call configure_beeai_tracing() BEFORE importing BeeAI modules!
Option 1: Using TraceAI with fi_instrumentation
```python
# Set up TraceAI FIRST
from fi_instrumentation import register
from fi_instrumentation.fi_types import ProjectType
from traceai_beeai import configure_beeai_tracing

trace_provider = register(
    project_type=ProjectType.OBSERVE,
    project_name="my-beeai-agent",
)
configure_beeai_tracing(tracer_provider=trace_provider)

# NOW import BeeAI modules
from beeai_framework.agents import Agent
from beeai_framework.backend.chat import ChatModel

model = ChatModel.from_name("ollama:granite3.1-dense:8b")
agent = Agent(
    llm=model,
    role="Assistant",
    instructions="You are a helpful assistant.",
)
response = agent.run("Hello!")
```
Option 2: Direct OTLP Configuration
```python
from traceai_beeai import configure_beeai_tracing

# Configure with an OTLP endpoint directly
configure_beeai_tracing(
    otlp_endpoint="https://api.traceai.com/v1/traces",
    otlp_headers={"Authorization": "Bearer YOUR_API_KEY"},
    project_name="my-beeai-agent",
)

# Now import and use BeeAI
from beeai_framework.agents import Agent
# ...
```
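Because instrumentation only takes effect for modules imported after `configure_beeai_tracing()` runs, a defensive check can catch accidental early imports. A minimal standard-library sketch (the guard function is hypothetical, not part of this package):

```python
import sys

def assert_beeai_not_imported() -> None:
    """Raise if BeeAI modules were imported before tracing was configured.

    Modules already present in sys.modules were loaded before the
    instrumentation was set up, so their activity would go untraced.
    """
    already_loaded = [name for name in sys.modules
                      if name.startswith("beeai_framework")]
    if already_loaded:
        raise RuntimeError(
            f"Configure tracing before importing BeeAI; "
            f"already loaded: {already_loaded}"
        )

# Call this right before configure_beeai_tracing(...)
assert_beeai_not_imported()
```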
Using Tools
BeeAI tools are automatically traced:
```python
from beeai_framework.agents import Agent
from beeai_framework.backend.chat import ChatModel
from beeai_framework.tools import WikipediaTool, OpenMeteoTool, ThinkTool

model = ChatModel.from_name("ollama:granite3.1-dense:8b")
agent = Agent(
    llm=model,
    role="Research Assistant",
    instructions="Use tools to find information.",
    tools=[
        ThinkTool(),      # Internal reasoning
        WikipediaTool(),  # Knowledge retrieval
        OpenMeteoTool(),  # Weather forecasts
    ],
)
response = agent.run("What's the weather in Paris?")
```
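Conceptually, automatic tool tracing wraps each tool invocation in a span that records the gen_ai.tool.* attributes documented later in this page. A simplified stand-alone sketch of that pattern (illustrative only, not the actual instrumentation code):

```python
import functools
import time

captured_spans = []  # stand-in for an OpenTelemetry span exporter

def traced_tool(fn):
    """Wrap a tool function, recording its name, parameters, result,
    and duration as span-like attributes."""
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        result = fn(*args, **kwargs)
        captured_spans.append({
            "gen_ai.tool.name": fn.__name__,
            "gen_ai.tool.parameters": {"args": args, "kwargs": kwargs},
            "gen_ai.tool.result": result,
            "duration_ms": (time.perf_counter() - start) * 1000,
        })
        return result
    return wrapper

@traced_tool
def get_weather(city: str) -> str:
    return f"Sunny in {city}"

print(get_weather("Paris"))  # the call is also recorded in captured_spans
```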
Custom Middleware
For extended tracing with session tracking:
```python
from traceai_beeai import create_tracing_middleware

middleware = create_tracing_middleware(
    tracer_provider=trace_provider,
    capture_input=True,
    capture_output=True,
    session_id="user-session-123",
    user_id="user@example.com",
)

agent = Agent(
    llm=model,
    role="Assistant",
    instructions="You are helpful.",
    middlewares=[middleware],
)
```
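The middleware pattern itself is simple: an object that observes each run and optionally records inputs and outputs along with session metadata. A minimal illustration of the idea (a hypothetical class, not the package's actual middleware):

```python
class RecordingMiddleware:
    """Record agent inputs/outputs with session metadata, mirroring the
    capture_input/capture_output/session_id options shown above."""

    def __init__(self, capture_input=True, capture_output=True,
                 session_id=None, user_id=None):
        self.capture_input = capture_input
        self.capture_output = capture_output
        self.session_id = session_id
        self.user_id = user_id
        self.events = []

    def on_run(self, prompt, run_agent):
        # Attach session correlation attributes to every event
        event = {"beeai.session.id": self.session_id,
                 "beeai.user.id": self.user_id}
        if self.capture_input:
            event["input"] = prompt
        output = run_agent(prompt)
        if self.capture_output:
            event["output"] = output
        self.events.append(event)
        return output

mw = RecordingMiddleware(session_id="user-session-123",
                         user_id="user@example.com")
reply = mw.on_run("Hello!", lambda p: p.upper())  # toy "agent"
```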
Using Requirements
BeeAI agents with requirements are fully traced:
```python
from beeai_framework.agents import Agent
from beeai_framework.requirements import ConditionalRequirement
from beeai_framework.tools import ThinkTool

agent = Agent(
    llm=model,
    role="Safe Assistant",
    instructions="Always think before responding.",
    tools=[ThinkTool()],
    requirements=[
        ConditionalRequirement(
            step=0,
            tool=ThinkTool,  # Force ThinkTool on first step
        ),
    ],
)
```
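A conditional requirement like the one above effectively constrains which tool may run at a given step. The enforcement idea can be sketched independently of BeeAI (a simplified stand-in, not the framework's implementation):

```python
def check_requirement(step_index: int, tool_name: str,
                      required_at_step: dict) -> bool:
    """Return True if the tool chosen at step_index satisfies the
    step -> required-tool constraints."""
    required = required_at_step.get(step_index)
    return required is None or tool_name == required

# Force "ThinkTool" on the first step, any tool afterwards
constraints = {0: "ThinkTool"}
print(check_requirement(0, "ThinkTool", constraints))      # True
print(check_requirement(0, "WikipediaTool", constraints))  # False
print(check_requirement(1, "WikipediaTool", constraints))  # True
```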
Semantic Attributes
The integration captures these OpenTelemetry GenAI semantic attributes:
Agent Attributes
| Attribute | Description |
|---|---|
| agent.name | Agent name |
| agent.type | Agent class name |
| agent.role | Agent role description |
| agent.instructions | Agent instructions (truncated) |
| beeai.tool_count | Number of tools available |
| beeai.requirements | Configured requirements |
| beeai.memory.type | Memory strategy type |
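agent.instructions is truncated before being attached to a span, since attribute values should stay small. A possible truncation helper, sketched for illustration (the 200-character limit is an assumption, not the package's actual value):

```python
def truncate_attribute(value: str, max_len: int = 200) -> str:
    """Clip long attribute values, marking the cut with an ellipsis.

    NOTE: the default max_len is an assumed value for illustration.
    """
    if len(value) <= max_len:
        return value
    return value[: max_len - 3] + "..."

print(truncate_attribute("short instructions"))  # unchanged
print(len(truncate_attribute("x" * 1000)))       # 200
```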
Model Attributes
| Attribute | Description |
|---|---|
| gen_ai.system | Model provider (ibm, openai, etc.) |
| gen_ai.request.model | Model name/ID |
| gen_ai.request.temperature | Temperature setting |
| gen_ai.request.max_tokens | Max tokens setting |
| gen_ai.usage.input_tokens | Input token count |
| gen_ai.usage.output_tokens | Output token count |
Tool Attributes
| Attribute | Description |
|---|---|
| gen_ai.tool.name | Tool name |
| gen_ai.tool.description | Tool description |
| gen_ai.tool.parameters | Tool input parameters |
| gen_ai.tool.result | Tool execution result |
Workflow Attributes
| Attribute | Description |
|---|---|
| beeai.workflow.name | Workflow name |
| beeai.workflow.step | Current step name |
| beeai.session.id | Session identifier |
| beeai.user.id | User identifier |
Model Provider Support
The integration automatically detects model providers:
| Provider | Model Patterns |
|---|---|
| IBM | granite-*, ibm/* |
| OpenAI | gpt-*, o1-* |
| Anthropic | claude-* |
| Meta | llama-* |
| Google | gemini-* |
| Mistral | mistral-*, mixtral-* |
| Ollama | ollama/* |
| Groq | groq/* |
| Together | together/* |
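Pattern-based detection like this can be expressed with simple glob matching. A stand-alone sketch (the mapping mirrors the table above; the function name is illustrative, not the package's API):

```python
from fnmatch import fnmatch

PROVIDER_PATTERNS = {
    "ibm": ["granite-*", "ibm/*"],
    "openai": ["gpt-*", "o1-*"],
    "anthropic": ["claude-*"],
    "meta": ["llama-*"],
    "google": ["gemini-*"],
    "mistral": ["mistral-*", "mixtral-*"],
    "ollama": ["ollama/*"],
    "groq": ["groq/*"],
    "together": ["together/*"],
}

def detect_provider(model_name: str) -> str:
    """Map a model name to a provider via glob patterns; 'unknown' if none match."""
    name = model_name.lower()
    for provider, patterns in PROVIDER_PATTERNS.items():
        if any(fnmatch(name, pattern) for pattern in patterns):
            return provider
    return "unknown"

print(detect_provider("granite-3.1-dense"))  # ibm
print(detect_provider("claude-3-haiku"))     # anthropic
```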
Environment Variables
BeeAI respects standard OpenTelemetry environment variables:
```shell
# OTLP endpoint
export OTEL_EXPORTER_OTLP_TRACES_ENDPOINT="https://api.traceai.com/v1/traces"

# OTLP headers (authentication)
export OTEL_EXPORTER_OTLP_HEADERS="Authorization=Bearer YOUR_API_KEY"
```
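Per the OpenTelemetry specification, OTEL_EXPORTER_OTLP_HEADERS holds a comma-separated list of key=value pairs. A small parser sketch showing how such a value becomes a headers dict:

```python
def parse_otlp_headers(raw: str) -> dict[str, str]:
    """Parse a comma-separated list of key=value pairs, as used by
    OTEL_EXPORTER_OTLP_HEADERS."""
    headers = {}
    for pair in raw.split(","):
        if not pair.strip():
            continue
        # Split on the first '=' only, so values may contain '='
        key, _, value = pair.partition("=")
        headers[key.strip()] = value.strip()
    return headers

print(parse_otlp_headers("Authorization=Bearer YOUR_API_KEY"))
# {'Authorization': 'Bearer YOUR_API_KEY'}
```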
Examples
See the examples/ directory for complete examples:
- basic_agent.py - Simple agent with tracing
- agent_with_tools.py - Tools and function calling
- requirements_agent.py - Behavioral requirements
- middleware_tracing.py - Custom middleware with session tracking
How It Works
BeeAI has native observability support via OpenInference instrumentation. This integration:
- Configures OpenTelemetry: Sets up tracer provider and OTLP export
- Initializes BeeAIInstrumentor: Wraps BeeAI internals for automatic span creation
- Provides middleware: Optional TraceAIMiddleware for extended capture
- Adds helper functions: Attribute extraction and trace context management
Compatibility
- BeeAI Framework: >= 0.1.0
- Python: >= 3.11
- OpenTelemetry: >= 1.0.0
- OpenInference: >= 0.1.0
Resources
- BeeAI Framework Documentation
- BeeAI GitHub
- IBM Research - BeeAI
- OpenTelemetry GenAI Conventions
- TraceAI Documentation
License
Apache-2.0