OpenInference OpenLIT Instrumentation

Python auto-instrumentation library for OpenLIT. This library converts OpenLIT traces into the OpenInference format, which is OpenTelemetry compatible, so you can view those traces in Arize Phoenix.

Installation

pip install openinference-instrumentation-openlit

Quickstart

This quickstart shows you how to view your OpenLIT traces in Phoenix.

Install required packages.

pip install arize-phoenix opentelemetry-sdk opentelemetry-exporter-otlp openlit semantic-kernel

Start Phoenix in the background as a collector. By default, it listens on http://localhost:6006, and you can visit the app in a browser at the same address.

phoenix serve
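Before continuing, you can sanity-check that the collector is actually listening. A minimal stdlib sketch, assuming Phoenix's default port 6006 (the `port_open` helper is our own, not part of Phoenix):

```python
import socket

def port_open(host: str = "localhost", port: int = 6006, timeout: float = 1.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

print("Phoenix is up" if port_open() else "Phoenix is not reachable on :6006")
```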

Here's a simple example that demonstrates how to convert OpenLIT traces into OpenInference and view those traces in Phoenix:

import asyncio
import os

import grpc
import openlit
from openinference.instrumentation.openlit import OpenInferenceSpanProcessor
from opentelemetry.exporter.otlp.proto.grpc.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from phoenix.otel import register
from semantic_kernel import Kernel
from semantic_kernel.connectors.ai.open_ai import OpenAIChatCompletion

# Set your OpenAI API key
os.environ["OPENAI_API_KEY"] = "YOUR_OPENAI_API_KEY"

# Set up the tracer provider
tracer_provider = register(
    project_name="default",  # Phoenix project name
)

# Convert OpenLIT spans to OpenInference before they are exported
tracer_provider.add_span_processor(OpenInferenceSpanProcessor())

tracer_provider.add_span_processor(
    BatchSpanProcessor(
        OTLPSpanExporter(
            # If using Phoenix Cloud, change to your Phoenix Cloud endpoint
            # (Phoenix Cloud space -> Settings -> Endpoint/Hostname)
            endpoint="http://localhost:4317",
            headers={},
            compression=grpc.Compression.Gzip,
        )
    )
)

# Initialize OpenLIT with the tracer
tracer = tracer_provider.get_tracer(__name__)
openlit.init(tracer=tracer)

# Set up Semantic Kernel with OpenLIT
kernel = Kernel()
kernel.add_service(
    OpenAIChatCompletion(
        service_id="default",
        ai_model_id="gpt-4o-mini",
    ),
)

# invoke_prompt is a coroutine, so run it inside an asyncio event loop
async def main() -> None:
    result = await kernel.invoke_prompt(
        prompt="What is the national food of Yemen?",
        arguments={},
    )
    print(result)

asyncio.run(main())

# Now view your converted OpenLIT traces in Phoenix!
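The exporter above hard-codes http://localhost:4317. When switching between local Phoenix and Phoenix Cloud, it can be convenient to read the endpoint from an environment variable instead. A small stdlib sketch (the `resolve_endpoint` helper is our own; the `PHOENIX_COLLECTOR_ENDPOINT` variable name follows Phoenix's convention, but verify it against your setup):

```python
import os

def resolve_endpoint(default: str = "http://localhost:4317") -> str:
    """Return the collector endpoint from PHOENIX_COLLECTOR_ENDPOINT, else the default."""
    return os.environ.get("PHOENIX_COLLECTOR_ENDPOINT", default)

# Pass the result as the `endpoint` argument to OTLPSpanExporter.
print(resolve_endpoint())
```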

This example:

  1. Uses OpenLIT to instrument the application.
  2. Defines a simple Semantic Kernel model and runs a query.
  3. Exports the resulting spans to Phoenix using a span processor.

The traces will be visible in the Phoenix UI at http://localhost:6006.
