
openlit -> GenAI translator emitter for OpenTelemetry GenAI

Project description

This package automatically translates spans produced by the OpenLit SDK into the OpenTelemetry GenAI semantic conventions. It intercepts spans carrying OpenLit-specific `gen_ai.*` attributes and emits corresponding spans whose `gen_ai.*` attributes comply with the semantic conventions, enabling seamless integration between OpenLit instrumentation and GenAI observability tools.
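As an illustration of the translation idea (not the package's actual internals), a flat OpenLit prompt attribute such as `gen_ai.prompt.0.content` can be restructured into the semantic conventions' `gen_ai.input.messages` shape. The JSON layout below (`role` plus a `parts` list) follows the structured-message form suggested by the GenAI semantic conventions; the exact serialization and the default role are assumptions made for this sketch:

```python
import json


def prompt_to_input_messages(attrs: dict) -> str:
    """Collect flat OpenLit attributes like gen_ai.prompt.0.content into a
    JSON list of role/parts message objects (gen_ai.input.messages shape).
    Hypothetical helper for illustration only."""
    messages = []
    i = 0
    while f"gen_ai.prompt.{i}.content" in attrs:
        messages.append({
            # Role attribute and "user" default are assumptions for this sketch.
            "role": attrs.get(f"gen_ai.prompt.{i}.role", "user"),
            "parts": [
                {"type": "text", "content": attrs[f"gen_ai.prompt.{i}.content"]}
            ],
        })
        i += 1
    return json.dumps(messages)


print(prompt_to_input_messages({"gen_ai.prompt.0.content": "What is LLM Observability?"}))
```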

Mapping Table

| Old Key (OpenLit) | New Key (OTel SemConv) |
| --- | --- |
| `gen_ai.completion.0.content` | `gen_ai.output.messages` |
| `gen_ai.prompt.0.content` | `gen_ai.input.messages` |
| `gen_ai.prompt` | `gen_ai.input.messages` |
| `gen_ai.completion` | `gen_ai.output.messages` |
| `gen_ai.content.prompt` | `gen_ai.input.messages` |
| `gen_ai.content.completion` | `gen_ai.output.messages` |
| `gen_ai.request.embedding_dimension` | `gen_ai.embeddings.dimension.count` |
| `gen_ai.token.usage.input` | `gen_ai.usage.input_tokens` |
| `gen_ai.token.usage.output` | `gen_ai.usage.output_tokens` |
| `gen_ai.llm.provider` | `gen_ai.provider.name` |
| `gen_ai.llm.model` | `gen_ai.request.model` |
| `gen_ai.llm.temperature` | `gen_ai.request.temperature` |
| `gen_ai.llm.max_tokens` | `gen_ai.request.max_tokens` |
| `gen_ai.llm.top_p` | `gen_ai.request.top_p` |
| `gen_ai.operation.type` | `gen_ai.operation.name` |
| `gen_ai.output_messages` | `gen_ai.output.messages` |
| `gen_ai.session.id` | `gen_ai.conversation.id` |
| `gen_ai.openai.thread.id` | `gen_ai.conversation.id` |
| `gen_ai.tool.args` | `gen_ai.tool.call.arguments` |
| `gen_ai.tool.result` | `gen_ai.tool.call.result` |
| `gen_ai.vectordb.name` | `db.system.name` |
| `gen_ai.vectordb.search.query` | `db.query.text` |
| `gen_ai.vectordb.search.results_count` | `db.response.returned_rows` |
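The one-to-one renames in the table reduce to a simple key-substitution pass over a span's attributes. The sketch below (a hypothetical illustration, not the package's implementation) captures a subset of the flat renames as a dictionary:

```python
# Hypothetical sketch of the flat one-to-one renames from the table above;
# the package's actual internals may differ.
OPENLIT_TO_SEMCONV = {
    "gen_ai.llm.provider": "gen_ai.provider.name",
    "gen_ai.llm.model": "gen_ai.request.model",
    "gen_ai.llm.temperature": "gen_ai.request.temperature",
    "gen_ai.llm.max_tokens": "gen_ai.request.max_tokens",
    "gen_ai.llm.top_p": "gen_ai.request.top_p",
    "gen_ai.token.usage.input": "gen_ai.usage.input_tokens",
    "gen_ai.token.usage.output": "gen_ai.usage.output_tokens",
    "gen_ai.operation.type": "gen_ai.operation.name",
    "gen_ai.session.id": "gen_ai.conversation.id",
    "gen_ai.vectordb.name": "db.system.name",
    "gen_ai.vectordb.search.query": "db.query.text",
    "gen_ai.vectordb.search.results_count": "db.response.returned_rows",
}


def translate(attrs: dict) -> dict:
    """Return a copy of the span attributes with known OpenLit keys renamed
    to their OTel GenAI semantic-convention equivalents; unknown keys pass
    through unchanged."""
    return {OPENLIT_TO_SEMCONV.get(k, k): v for k, v in attrs.items()}


print(translate({"gen_ai.llm.model": "gpt-3.5-turbo", "gen_ai.llm.temperature": 0.7}))
```

Message-shaped keys such as `gen_ai.prompt.0.content` additionally need their values restructured into the semconv messages format, so they cannot be handled by a pure rename.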

Installation

pip install opentelemetry-util-genai-openlit-translator

Quick Start (Automatic Registration)

The easiest way to use the translator is to simply import it; no manual setup is required.

```python
from openai import OpenAI
import openlit
from dotenv import load_dotenv
import os
import traceback

load_dotenv()

try:
    openlit.init(otlp_endpoint="http://0.0.0.0:4318")

    client = OpenAI(api_key=os.getenv("OPENAI_API_KEY"))

    chat_completion = client.chat.completions.create(
        messages=[
            {
                "role": "user",
                "content": "What is LLM Observability?",
            }
        ],
        model="gpt-3.5-turbo",
    )
    print("response:", chat_completion.choices[0].message.content)
except Exception as e:
    print(f"An error occurred: {e}")
    traceback.print_exc()
```

Tests

pytest util/opentelemetry-util-genai-openlit-translator/tests


