
OpenInference OpenLLMetry (Traceloop)

Python auto-instrumentation library for OpenLLMetry. This library converts OpenLLMetry traces to OpenInference, an OpenTelemetry-compatible format, so you can view them in Arize Phoenix.

Installation

pip install openinference-instrumentation-openllmetry

Quickstart

This quickstart shows you how to view your OpenLLMetry traces in Phoenix.

Install required packages.

pip install arize-phoenix opentelemetry-sdk opentelemetry-exporter-otlp opentelemetry-instrumentation-openai

Start Phoenix in the background as a collector. By default, it listens on http://localhost:6006. You can visit the app via a browser at the same address. (Phoenix does not send data over the internet. It only operates locally on your machine.)

phoenix serve
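
Alternatively, you can launch Phoenix from a Python session with the arize-phoenix package's launch_app helper (a minimal sketch; the phoenix serve CLI command above is equivalent for this quickstart):

import phoenix as px

# Launches Phoenix in-process; the app serves on http://localhost:6006 by default
session = px.launch_app()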

Here's a simple example that demonstrates how to convert OpenLLMetry traces into OpenInference and view those traces in Phoenix:

import os
import grpc
import openai
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.grpc.trace_exporter import OTLPSpanExporter
from phoenix.otel import register
from openinference.instrumentation.openllmetry import OpenInferenceSpanProcessor
from opentelemetry.instrumentation.openai import OpenAIInstrumentor

# Set your OpenAI API key
os.environ["OPENAI_API_KEY"] = "YOUR_OPENAI_API_KEY"

# Set up the tracer provider
tracer_provider = register(
    project_name="default",  # Phoenix project name
)

# Convert incoming OpenLLMetry spans to OpenInference
tracer_provider.add_span_processor(OpenInferenceSpanProcessor())

# Export the converted spans to Phoenix
tracer_provider.add_span_processor(
    BatchSpanProcessor(
        OTLPSpanExporter(
            # If using Phoenix Cloud, change to your Phoenix Cloud endpoint
            # (Phoenix Cloud space -> Settings -> Endpoint/Hostname)
            endpoint="http://localhost:4317",
            headers={},
            compression=grpc.Compression.Gzip,  # use the enum, not a string
        )
    )
)

# Instrument OpenAI calls with OpenLLMetry's instrumentor
OpenAIInstrumentor().instrument(tracer_provider=tracer_provider)

# Define and invoke your OpenAI model
client = openai.OpenAI()

messages = [
    {"role": "user", "content": "What is the national food of Yemen?"}
]

response = client.chat.completions.create(
    model="gpt-4",
    messages=messages,
)
print(response.choices[0].message.content)

# Now view your converted OpenLLMetry traces in Phoenix!

This example:

  1. Instruments the application with OpenLLMetry's OpenAIInstrumentor.
  2. Defines a simple OpenAI model and runs a query.
  3. Converts the resulting OpenLLMetry spans to OpenInference and exports them to Phoenix using span processors.
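
Because the BatchSpanProcessor exports spans asynchronously, a short-lived script can exit before the batch is sent. If you run the example as a standalone script, you can flush queued spans before exiting; a minimal sketch using the OpenTelemetry SDK's standard force_flush method:

# Ensure all queued spans are exported before the process exits
tracer_provider.force_flush()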

The traces will be visible in the Phoenix UI at http://localhost:6006.
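
If you are sending traces to Phoenix Cloud rather than a local collector, swap the exporter endpoint and add an authentication header. A hedged sketch: the endpoint below is a placeholder, and the api_key header name is an assumption to verify against your space's settings (Phoenix Cloud space -> Settings -> Endpoint/Hostname):

import os
import grpc
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.grpc.trace_exporter import OTLPSpanExporter

exporter = OTLPSpanExporter(
    endpoint="https://your-space.phoenix.arize.com:4317",  # placeholder endpoint
    headers={"api_key": os.environ["PHOENIX_API_KEY"]},  # assumed header name; check your space settings
    compression=grpc.Compression.Gzip,
)
tracer_provider.add_span_processor(BatchSpanProcessor(exporter))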

