
OpenInference Haystack Instrumentation


Python auto-instrumentation library for LLM applications implemented with Haystack.

Traces from Haystack Pipelines and Components (e.g., PromptBuilder, OpenAIGenerator) are fully OpenTelemetry-compatible and can be sent to an OpenTelemetry collector for monitoring, such as arize-phoenix.

Installation

pip install openinference-instrumentation-haystack

Quickstart

This quickstart shows you how to instrument your Haystack-orchestrated LLM application.

In your terminal, install the required packages:

pip install openinference-instrumentation-haystack haystack-ai arize-phoenix opentelemetry-sdk opentelemetry-exporter-otlp

Install Phoenix and start it with the following terminal commands:

pip install arize-phoenix
python -m phoenix.server.main serve

Phoenix now runs in the background as a collector. By default, it listens on http://localhost:6006; you can visit the app in a browser at the same address. (Phoenix does not send data over the internet. It operates only locally on your machine.)

Try the following in a Python file.

Set up HaystackInstrumentor to trace your application and send the traces to Phoenix at the endpoint defined below.

from openinference.instrumentation.haystack import HaystackInstrumentor
from opentelemetry import trace as trace_api
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk import trace as trace_sdk
from opentelemetry.sdk.trace.export import SimpleSpanProcessor
import os

# Set your OpenAI API key
os.environ["OPENAI_API_KEY"] = "YOUR_KEY_HERE"

# Set up the tracer, using Arize Phoenix as the endpoint
endpoint = "http://127.0.0.1:6006/v1/traces"
tracer_provider = trace_sdk.TracerProvider()
trace_api.set_tracer_provider(tracer_provider)
tracer_provider.add_span_processor(SimpleSpanProcessor(OTLPSpanExporter(endpoint)))

# Instrument the Haystack application
HaystackInstrumentor().instrument()

Set up a simple Pipeline containing an OpenAIGenerator component.

from haystack import Pipeline
from haystack.components.generators import OpenAIGenerator

# Initialize the pipeline
pipeline = Pipeline()

# Initialize the OpenAI generator component
llm = OpenAIGenerator(model="gpt-3.5-turbo")

# Add the generator component to the pipeline
pipeline.add_component("llm", llm)

# Define the question
question = "What is the location of the Hanging Gardens of Babylon?"

# Run the pipeline with the question
response = pipeline.run({"llm": {"prompt": question}})

print(response)

Now, in the Phoenix UI in your browser, you should see the traces from your Haystack application. Specifically, you can see attributes from the execution of the OpenAIGenerator.
