
OpenInference OpenLIT Instrumentation

Project description


Python auto-instrumentation library for OpenLIT. This library converts OpenLIT traces into the OpenInference format, which is OpenTelemetry-compatible, so you can view those traces in Arize Phoenix.

Installation

pip install openinference-instrumentation-openlit

Quickstart

This quickstart shows you how to view your OpenLIT traces in Phoenix.

Install required packages.

pip install arize-phoenix opentelemetry-sdk opentelemetry-exporter-otlp openlit semantic-kernel

Start Phoenix in the background as a collector. By default it listens on http://localhost:6006; you can visit the app in a browser at the same address.

phoenix serve

Here's a simple example that demonstrates how to convert OpenLIT traces into OpenInference and view those traces in Phoenix:

import asyncio
import os

import grpc
import openlit
from openinference.instrumentation.openlit import OpenInferenceSpanProcessor
from opentelemetry.exporter.otlp.proto.grpc.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from phoenix.otel import register
from semantic_kernel import Kernel
from semantic_kernel.connectors.ai.open_ai import OpenAIChatCompletion

# Set your OpenAI API key
os.environ["OPENAI_API_KEY"] = "YOUR_OPENAI_API_KEY"

# Set up the tracer provider
tracer_provider = register(
    project_name="default",  # Phoenix project name
)

# Convert OpenLIT spans to OpenInference
tracer_provider.add_span_processor(OpenInferenceSpanProcessor())

# Export the converted spans to Phoenix
tracer_provider.add_span_processor(
    BatchSpanProcessor(
        OTLPSpanExporter(
            # If you use Phoenix Cloud, change this to your Phoenix Cloud
            # endpoint (Phoenix Cloud space -> Settings -> Endpoint/Hostname)
            endpoint="http://localhost:4317",
            headers={},
            compression=grpc.Compression.Gzip,
        )
    )
)

# Initialize the OpenLIT tracer
tracer = tracer_provider.get_tracer(__name__)
openlit.init(tracer=tracer)

# Set up Semantic Kernel with OpenLIT
kernel = Kernel()
kernel.add_service(
    OpenAIChatCompletion(
        service_id="default",
        ai_model_id="gpt-4o-mini",
    ),
)


async def main() -> None:
    # Define and invoke your model
    result = await kernel.invoke_prompt(
        prompt="What is the national food of Yemen?",
        arguments={},
    )
    print(result)


asyncio.run(main())

# Now view your converted OpenLIT traces in Phoenix!
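The hard-coded endpoint above matches a local `phoenix serve`; for a deployed collector (for example Phoenix Cloud) it is convenient to read the endpoint from the environment instead. The helper below is a minimal stdlib sketch — the `PHOENIX_COLLECTOR_ENDPOINT` variable name is the one Phoenix's own tooling conventionally uses, but treat the default value as an assumption for a local setup:

```python
import os


def collector_endpoint(default: str = "http://localhost:4317") -> str:
    """Return the OTLP gRPC endpoint to pass to OTLPSpanExporter.

    Prefers PHOENIX_COLLECTOR_ENDPOINT if set (e.g. a Phoenix Cloud
    hostname); otherwise falls back to the local collector started
    by `phoenix serve`.
    """
    return os.environ.get("PHOENIX_COLLECTOR_ENDPOINT", default)
```

You would then pass `endpoint=collector_endpoint()` to `OTLPSpanExporter`, keeping the rest of the quickstart unchanged between local and hosted deployments.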

This example:

  1. Initializes OpenLIT so that LLM calls are instrumented.
  2. Defines a simple Semantic Kernel model and runs a query.
  3. Exports the resulting spans to Phoenix via a span processor.

The traces will be visible in the Phoenix UI at http://localhost:6006.
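Conceptually, `OpenInferenceSpanProcessor` rewrites span attributes emitted under OpenLIT's conventions into OpenInference semantic-convention keys before export. The sketch below is illustrative only — the real mapping lives inside the library, and the key pairs shown are a plausible subset, not the actual conversion table:

```python
# Illustrative subset: how gen_ai-style attribute keys might map onto
# OpenInference semantic-convention keys. Not the library's real table.
OPENLIT_TO_OPENINFERENCE = {
    "gen_ai.request.model": "llm.model_name",
    "gen_ai.prompt": "input.value",
    "gen_ai.completion": "output.value",
}


def convert_attributes(attrs: dict) -> dict:
    """Rename known keys; pass unrecognized keys through unchanged."""
    return {
        OPENLIT_TO_OPENINFERENCE.get(key, key): value
        for key, value in attrs.items()
    }
```

Because unknown keys pass through untouched, custom attributes you attach to spans still arrive in Phoenix alongside the converted ones.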

