# TraceAI Strands Agents Integration
Comprehensive observability for AWS Strands Agents with TraceAI.
Strands Agents is an open-source SDK from AWS that enables building AI agents with a model-driven approach. This integration provides seamless tracing by leveraging Strands' built-in OpenTelemetry support.
## Features
- Zero-config integration: Works with Strands' native OTEL support
- Automatic tracing: Agent invocations, tool calls, and model interactions
- Token tracking: Input/output tokens and cache metrics (Bedrock)
- Session correlation: Link traces across conversations
- Custom callbacks: Extended event capture for detailed observability
- MCP support: Trace Model Context Protocol tool usage
## Installation

```bash
pip install traceai-strands
```

For full functionality with Strands:

```bash
pip install 'strands-agents[otel]'
```
## Quick Start

### Option 1: Using TraceAI with fi_instrumentation

```python
from fi_instrumentation import register
from fi_instrumentation.fi_types import ProjectType
from traceai_strands import configure_strands_tracing

# Set up TraceAI
trace_provider = register(
    project_type=ProjectType.OBSERVE,
    project_name="my-strands-agent",
)

# Configure Strands to use TraceAI
configure_strands_tracing(tracer_provider=trace_provider)

# Now use Strands normally - traces are sent automatically
from strands import Agent

agent = Agent(
    model="us.anthropic.claude-sonnet-4-20250514-v1:0",
    system_prompt="You are a helpful assistant.",
)
response = agent("Hello!")
```
### Option 2: Direct OTLP Configuration

```python
from traceai_strands import configure_strands_tracing

# Configure with an OTLP endpoint directly
configure_strands_tracing(
    otlp_endpoint="https://api.traceai.com/v1/traces",
    otlp_headers={"Authorization": "Bearer YOUR_API_KEY"},
    project_name="my-strands-agent",
)

from strands import Agent

agent = Agent(
    model="us.anthropic.claude-sonnet-4-20250514-v1:0",
    system_prompt="You are a helpful assistant.",
)
response = agent("Hello!")
```
## Adding Trace Attributes

Add session and user information for better trace correlation:

```python
from strands import Agent

agent = Agent(
    model="us.anthropic.claude-sonnet-4-20250514-v1:0",
    system_prompt="You are a helpful assistant.",
    trace_attributes={
        "session.id": "user-session-123",
        "user.id": "user@example.com",
        "tags": ["production", "chatbot"],
    },
)
```
Or use the helper function:
```python
from traceai_strands import create_traced_agent

agent = create_traced_agent(
    model="us.anthropic.claude-sonnet-4-20250514-v1:0",
    system_prompt="You are a helpful assistant.",
    session_id="user-session-123",
    user_id="user@example.com",
    tags=["production", "chatbot"],
)
```
## Using Tools

Strands tools are automatically traced when using the `@tool` decorator:

```python
from strands import Agent, tool
from typing import Annotated


@tool
def get_weather(city: Annotated[str, "City name"]) -> str:
    """Get the current weather for a city."""
    return f"Weather in {city}: 72°F, Sunny"


@tool
def calculate(
    operation: Annotated[str, "add, subtract, multiply, divide"],
    a: Annotated[float, "First number"],
    b: Annotated[float, "Second number"],
) -> float:
    """Perform a calculation."""
    ops = {
        "add": lambda x, y: x + y,
        "subtract": lambda x, y: x - y,
        "multiply": lambda x, y: x * y,
        "divide": lambda x, y: x / y,
    }
    return ops[operation](a, b)


agent = Agent(
    model="us.anthropic.claude-sonnet-4-20250514-v1:0",
    tools=[get_weather, calculate],
)
response = agent("What's 15 times 7, and what's the weather in Tokyo?")
```
## Custom Callback Handler

For extended event capture beyond Strands' built-in telemetry:

```python
from strands import Agent
from traceai_strands import StrandsCallbackHandler

# Create a callback handler (trace_provider comes from the Quick Start setup)
callback = StrandsCallbackHandler(
    tracer_provider=trace_provider,
    capture_input=True,
    capture_output=True,
)

# Use it with an agent - lifecycle events are traced automatically
agent = Agent(
    model="us.anthropic.claude-sonnet-4-20250514-v1:0",
    callback_handler=callback,
)
response = agent("Hello!")
```
## MCP Integration

Trace Model Context Protocol server tools:

```python
from strands import Agent
from strands.tools.mcp import MCPClient

# Connect to an MCP server
mcp_client = MCPClient(
    server_command=["npx", "@anthropic/mcp-server-calculator"],
)

# Get the MCP tools
mcp_tools = mcp_client.list_tools_sync()

# Create an agent with the MCP tools
agent = Agent(
    model="us.anthropic.claude-sonnet-4-20250514-v1:0",
    tools=mcp_tools,
)
response = agent("Calculate the square root of 144")
```
## Semantic Attributes

The integration captures these OpenTelemetry GenAI semantic attributes:

### Agent Attributes

| Attribute | Description |
|---|---|
| `agent.type` | Agent class name |
| `strands.system_prompt` | Agent's system prompt (truncated) |
| `strands.tool_count` | Number of tools available |
| `strands.session.id` | Session identifier |
| `strands.user.id` | User identifier |
### Model Attributes

| Attribute | Description |
|---|---|
| `gen_ai.system` | Model provider (bedrock, openai, etc.) |
| `gen_ai.request.model` | Model name/ID |
| `gen_ai.request.temperature` | Temperature setting |
| `gen_ai.request.max_tokens` | Max tokens setting |
| `gen_ai.usage.input_tokens` | Input token count |
| `gen_ai.usage.output_tokens` | Output token count |
| `strands.cache.read_tokens` | Cache read tokens (Bedrock) |
| `strands.cache.write_tokens` | Cache write tokens (Bedrock) |
### Tool Attributes

| Attribute | Description |
|---|---|
| `gen_ai.tool.name` | Tool function name |
| `gen_ai.tool.description` | Tool docstring |
| `gen_ai.tool.parameters` | Tool input parameters |
| `gen_ai.tool.result` | Tool execution result |
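As an illustrative sketch only (not the integration's actual code), a tool-call span's attributes following the `gen_ai.tool.*` naming above could be assembled like this; `tool_call_attributes` is a hypothetical helper:

```python
import json


def tool_call_attributes(func, parameters: dict, result) -> dict:
    """Build a flat attribute dict following the gen_ai.tool.* convention."""
    return {
        "gen_ai.tool.name": func.__name__,
        "gen_ai.tool.description": (func.__doc__ or "").strip(),
        # OTEL attribute values must be primitive types, so structured
        # parameters and results are JSON-serialized.
        "gen_ai.tool.parameters": json.dumps(parameters),
        "gen_ai.tool.result": json.dumps(result),
    }


def get_weather(city: str) -> str:
    """Get the current weather for a city."""
    return f"Weather in {city}: 72°F, Sunny"


attrs = tool_call_attributes(get_weather, {"city": "Tokyo"}, get_weather("Tokyo"))
```

Flat, primitive-valued attributes like these are what OTLP exporters accept, which is why structured payloads are serialized rather than nested.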
## Environment Variables

Strands respects the standard OpenTelemetry environment variables:

```bash
# OTLP endpoint
export OTEL_EXPORTER_OTLP_ENDPOINT="https://api.traceai.com/v1/traces"

# OTLP headers (authentication)
export OTEL_EXPORTER_OTLP_HEADERS="Authorization=Bearer YOUR_API_KEY"

# Sampling (optional)
export OTEL_TRACES_SAMPLER="traceidratio"
export OTEL_TRACES_SAMPLER_ARG="0.5"  # Sample 50% of traces
```
## Model Provider Support

The integration automatically detects model providers from the model identifier:

| Provider | Model Patterns |
|---|---|
| Bedrock | `us.anthropic.*`, `us.amazon.*`, `eu.*` |
| OpenAI | `gpt-*`, `o1-*`, `text-davinci-*` |
| Anthropic | `claude-*` |
| Google | `gemini-*`, `palm-*` |
| Mistral | `mistral-*`, `mixtral-*` |
| Meta | `llama-*` |
| Ollama | `ollama/*` |
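A minimal sketch of how such pattern-based detection could work, using first-match glob patterns from the table above (illustrative only; `detect_provider` is not the package's actual API):

```python
from fnmatch import fnmatch

# Ordered (pattern, provider) pairs mirroring the table; first match wins,
# so the more specific Bedrock prefixes come before generic model names.
PROVIDER_PATTERNS = [
    ("us.anthropic.*", "bedrock"),
    ("us.amazon.*", "bedrock"),
    ("eu.*", "bedrock"),
    ("gpt-*", "openai"),
    ("o1-*", "openai"),
    ("text-davinci-*", "openai"),
    ("claude-*", "anthropic"),
    ("gemini-*", "google"),
    ("palm-*", "google"),
    ("mistral-*", "mistral"),
    ("mixtral-*", "mistral"),
    ("llama-*", "meta"),
    ("ollama/*", "ollama"),
]


def detect_provider(model_id: str) -> str:
    """Return the provider name for a model ID, or 'unknown' if nothing matches."""
    for pattern, provider in PROVIDER_PATTERNS:
        if fnmatch(model_id, pattern):
            return provider
    return "unknown"
```

Ordering matters: a Bedrock model ID like `us.anthropic.claude-sonnet-4-20250514-v1:0` must match the `us.anthropic.*` prefix before any `claude-*` rule could be considered.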
## Examples

See the `examples/` directory for complete examples:

- `basic_agent.py` - Simple agent with tracing
- `agent_with_tools.py` - Tools and function calling
- `callback_handler.py` - Custom callback handler
- `mcp_agent.py` - MCP server integration
## How It Works

Strands has native OpenTelemetry support through its `StrandsTelemetry` class. This integration:

- **Configures OTLP export**: Sets environment variables or configures `StrandsTelemetry` to send traces to TraceAI
- **Adds helper functions**: Provides `create_traced_agent()` and attribute helpers
- **Optional callbacks**: `StrandsCallbackHandler` for extended event capture

The integration is lightweight because Strands already does the heavy lifting for telemetry.
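To make the environment-variable path concrete, such configuration could look roughly like this sketch (illustrative only; `configure_otlp_env` is a hypothetical helper, and the package's real `configure_strands_tracing` may work differently):

```python
import os


def configure_otlp_env(endpoint: str, api_key: str, project_name: str) -> None:
    """Point Strands' native OTEL exporter at a collector by setting the
    standard OpenTelemetry environment variables, which the SDK reads when
    telemetry is initialized."""
    os.environ["OTEL_EXPORTER_OTLP_ENDPOINT"] = endpoint
    os.environ["OTEL_EXPORTER_OTLP_HEADERS"] = f"Authorization=Bearer {api_key}"
    os.environ["OTEL_SERVICE_NAME"] = project_name


configure_otlp_env(
    endpoint="https://api.traceai.com/v1/traces",
    api_key="YOUR_API_KEY",
    project_name="my-strands-agent",
)
```

Because these variables are read at telemetry initialization, they must be set before the agent (and its `StrandsTelemetry`) is created.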
## Compatibility
- Strands Agents: >= 1.0.0
- Python: >= 3.10
- OpenTelemetry: >= 1.0.0
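A quick runtime check against these minimums can be sketched with the standard library; the PyPI distribution names used here (`strands-agents`, `opentelemetry-api`) are assumptions:

```python
import sys
from importlib.metadata import version, PackageNotFoundError


def check_compatibility() -> list[str]:
    """Return a list of human-readable problems; an empty list means compatible."""
    problems = []
    if sys.version_info < (3, 10):
        problems.append(
            f"Python {sys.version_info.major}.{sys.version_info.minor} is older than 3.10"
        )
    for pkg, minimum in [("strands-agents", (1, 0, 0)), ("opentelemetry-api", (1, 0, 0))]:
        try:
            parts = tuple(int(p) for p in version(pkg).split(".")[:3])
            installed = (parts + (0, 0, 0))[:3]  # pad short versions like "1.0"
            if installed < minimum:
                problems.append(f"{pkg} is older than the required minimum")
        except PackageNotFoundError:
            problems.append(f"{pkg} is not installed")
        except ValueError:
            pass  # non-numeric version segment; skip the comparison
    return problems
```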
## License
Apache-2.0