Trace Haystack LLM and agent calls with Blue Guardrails.
Blue Guardrails - Haystack
The Blue Guardrails Haystack integration instruments LLM and agent calls in Haystack and sends them as OpenTelemetry traces to the Blue Guardrails platform.
Use Blue Guardrails to monitor your agents and other GenAI applications in production, or to evaluate them before deployment.
The Blue Guardrails reliability layer runs on ingested traces and detects problems such as hallucinations, poor instruction following, and tool-calling errors.
For more information, see the official Blue Guardrails documentation.
Features
| Feature | Description |
|---|---|
| Haystack tracing | Trace Haystack agents and direct LLM calls. |
| OpenTelemetry GenAI compatibility | Export traces that follow the OpenTelemetry GenAI semantic conventions. |
| Rich trace data | Capture inputs, outputs, tool calls, tool definitions, model parameters, and usage. |
| Sampling controls | Choose how much trace data to send with configurable sampling. |
| Reliability layer | Use the Blue Guardrails reliability layer to help prevent your agent from going off track. |
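The exact span attributes this package exports aren't listed here; as an illustration of the OpenTelemetry GenAI semantic conventions referenced in the table, a chat-call span typically carries attributes like the following (names from the semantic conventions, values invented for the sketch):

```python
# Illustrative span attributes following the OpenTelemetry GenAI
# semantic conventions; the exact set this package emits may differ.
span_attributes = {
    "gen_ai.operation.name": "chat",
    "gen_ai.request.model": "gpt-4o-mini",
    "gen_ai.request.temperature": 0.7,
    "gen_ai.usage.input_tokens": 42,
    "gen_ai.usage.output_tokens": 17,
}

# All GenAI attribute names share the gen_ai.* namespace prefix.
assert all(key.startswith("gen_ai.") for key in span_attributes)
```

Any OTLP-compatible backend that understands these conventions can interpret the request model, operation, and token usage directly from the span.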
Install
Install the package with pip:

```shell
pip install blueguardrails-haystack
```

Or with uv:

```shell
uv add blueguardrails-haystack
```
The package supports Python 3.11 through 3.14.
Use
Trace an agent
To trace a Haystack agent, configure the Blue Guardrails tracer before you run the agent.
```python
from haystack.components.agents import Agent
from haystack.components.generators.chat import OpenAIChatGenerator
from haystack.dataclasses import ChatMessage
from haystack.tools import Tool

from blueguardrails_haystack import configure_blueguardrails_tracer


def get_weather(city: str) -> str:
    return f"The weather in {city} is sunny."


configure_blueguardrails_tracer(
    name="support-agent",
    tags={"environment": "development"},
)

weather_tool = Tool(
    name="get_weather",
    description="Returns the weather for a city.",
    parameters={
        "type": "object",
        "properties": {"city": {"type": "string"}},
        "required": ["city"],
    },
    function=get_weather,
)

agent = Agent(
    chat_generator=OpenAIChatGenerator(model="gpt-4o-mini"),
    tools=[weather_tool],
    system_prompt="Use tools when they help answer the user.",
    exit_conditions=["text"],
    max_agent_steps=3,
)

result = agent.run(
    messages=[ChatMessage.from_user("What is the weather in Berlin?")]
)
print(result["last_message"].text)
```
Blue Guardrails receives a trace for each generator call the agent makes.
`configure_blueguardrails_tracer()` reads `BLUE_GUARDRAILS_API_KEY` by default. If you don't set `BLUE_GUARDRAILS_API_KEY`, pass `api_key` explicitly; the function raises `ValueError` if neither is set:

```python
configure_blueguardrails_tracer(name="support-agent", api_key="your-api-key")
```
Trace a pipeline
Add `BlueGuardrailsConnector` to your pipeline. You don't need to connect it to other components.
```python
from haystack import Pipeline
from haystack.components.generators.chat import OpenAIChatGenerator
from haystack.dataclasses import ChatMessage
from haystack.utils import Secret

from blueguardrails_haystack import BlueGuardrailsConnector

pipe = Pipeline()
pipe.add_component(
    "blueguardrails",
    BlueGuardrailsConnector(
        name="support-bot",
        api_key=Secret.from_env_var("BLUE_GUARDRAILS_API_KEY"),
        tags={"environment": "development"},
    ),
)
pipe.add_component("llm", OpenAIChatGenerator(model="gpt-5.4-mini"))

result = pipe.run(
    {
        "llm": {
            "messages": [
                ChatMessage.from_user("Reply in one sentence. What is Haystack?")
            ]
        }
    }
)
print(result["llm"]["replies"][0].text)
```
When the pipeline runs, Blue Guardrails receives a trace for the `llm` call.
Load and trace a pipeline from YAML
Haystack pipelines can be serialized to and loaded from YAML. Put the connector in the YAML just like any other component; loading the pipeline initializes Blue Guardrails tracing.
`pipeline.yaml`:

```yaml
components:
  blueguardrails:
    type: blueguardrails_haystack.components.connector.BlueGuardrailsConnector
    init_parameters:
      name: support-bot
      api_key:
        type: env_var
        env_vars:
          - BLUE_GUARDRAILS_API_KEY
        strict: true
      sample_rate: 1.0
      tags:
        environment: development
  llm:
    type: haystack.components.generators.chat.openai.OpenAIChatGenerator
    init_parameters:
      model: gpt-5.4-mini
      api_key:
        type: env_var
        env_vars:
          - OPENAI_API_KEY
        strict: true

connections: []
```
Set `BLUE_GUARDRAILS_API_KEY` and `OPENAI_API_KEY` in your environment, then load and run it:
```python
from pathlib import Path

from haystack import Pipeline
from haystack.dataclasses import ChatMessage

pipeline = Pipeline.loads(Path("pipeline.yaml").read_text())

result = pipeline.run(
    {
        "llm": {
            "messages": [
                ChatMessage.from_user("Reply in one sentence. What is Haystack?")
            ]
        }
    }
)
print(result["llm"]["replies"][0].text)
```
Use `Pipeline.dumps()` or `Pipeline.dump()` to serialize a Python-built pipeline back to YAML.
Configure the connector
```python
BlueGuardrailsConnector(
    name="production-rag",
    api_key=Secret.from_env_var("BLUE_GUARDRAILS_API_KEY"),
    sample_rate=0.1,
    tags={"environment": "production", "team": "search"},
)
```
| Argument | Default | Description |
|---|---|---|
| `name` | Required | Trace name shown in Blue Guardrails. |
| `api_key` | `Secret.from_env_var("BLUE_GUARDRAILS_API_KEY")` | API key for trace export. |
| `endpoint` | `https://api.blueguardrails.com/v1/traces` | OpenTelemetry Protocol (OTLP) HTTP traces endpoint. |
| `sample_rate` | `1.0` | Fraction of generator calls to export, between `0.0` and `1.0`. |
| `tags` | `None` | Conversation tags attached to each exported generator span. |
Traces include generator inputs and outputs. Review your data handling requirements before you enable the connector in production.
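The connector's sampling internals aren't documented here, but rate-based sampling of this kind is typically a per-call coin flip; a minimal stdlib sketch (the `should_export` helper is illustrative, not part of the package):

```python
import random


def should_export(sample_rate: float, rng: random.Random) -> bool:
    """Export a generator call's span with probability sample_rate."""
    return rng.random() < sample_rate


rng = random.Random(0)  # fixed seed so the sketch is reproducible
calls = 10_000
exported = sum(should_export(0.1, rng) for _ in range(calls))
print(f"exported {exported} of {calls} calls")  # roughly 10%
```

With `sample_rate=1.0` every call is exported and with `0.0` none are, so the default keeps full visibility while lower values trade coverage for volume.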
License
Apache-2.0