openlit -> GenAI translator emitter for OpenTelemetry GenAI

Project description

This package automatically translates spans instrumented by the OpenLit SDK into the OpenTelemetry GenAI semantic conventions. It intercepts spans carrying OpenLit-specific `gen_ai.*` attributes and emits corresponding spans whose attributes comply with the `gen_ai.*` semantic conventions, enabling seamless integration between OpenLit instrumentation and GenAI observability tools.

Mapping Table

| Old Key (OpenLit) | New Key (OTel SemConv) |
| --- | --- |
| gen_ai.completion.0.content | gen_ai.output.messages |
| gen_ai.prompt.0.content | gen_ai.input.messages |
| gen_ai.prompt | gen_ai.input.messages |
| gen_ai.completion | gen_ai.output.messages |
| gen_ai.content.prompt | gen_ai.input.messages |
| gen_ai.content.completion | gen_ai.output.messages |
| gen_ai.request.embedding_dimension | gen_ai.embeddings.dimension.count |
| gen_ai.token.usage.input | gen_ai.usage.input_tokens |
| gen_ai.token.usage.output | gen_ai.usage.output_tokens |
| gen_ai.llm.provider | gen_ai.provider.name |
| gen_ai.llm.model | gen_ai.request.model |
| gen_ai.llm.temperature | gen_ai.request.temperature |
| gen_ai.llm.max_tokens | gen_ai.request.max_tokens |
| gen_ai.llm.top_p | gen_ai.request.top_p |
| gen_ai.operation.type | gen_ai.operation.name |
| gen_ai.output_messages | gen_ai.output.messages |
| gen_ai.session.id | gen_ai.conversation.id |
| gen_ai.openai.thread.id | gen_ai.conversation.id |
| gen_ai.tool.args | gen_ai.tool.call.arguments |
| gen_ai.tool.result | gen_ai.tool.call.result |
| gen_ai.vectordb.name | db.system.name |
| gen_ai.vectordb.search.query | db.query.text |
| gen_ai.vectordb.search.results_count | db.response.returned_rows |
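At its core, the translation is a key-for-key rename of span attributes. The sketch below illustrates the idea in plain Python, with the dictionary mirroring a subset of the table above; the helper name `translate_attributes` is hypothetical and not part of this package's public API.

```python
# Illustrative subset of the OpenLit -> semantic-convention key mapping.
OPENLIT_TO_SEMCONV = {
    "gen_ai.prompt.0.content": "gen_ai.input.messages",
    "gen_ai.completion.0.content": "gen_ai.output.messages",
    "gen_ai.llm.provider": "gen_ai.provider.name",
    "gen_ai.llm.model": "gen_ai.request.model",
    "gen_ai.token.usage.input": "gen_ai.usage.input_tokens",
    "gen_ai.token.usage.output": "gen_ai.usage.output_tokens",
    "gen_ai.session.id": "gen_ai.conversation.id",
    # ... remaining rows of the table above
}

def translate_attributes(attrs: dict) -> dict:
    """Return a copy of span attributes with OpenLit keys renamed.

    Keys not found in the mapping pass through unchanged, so
    non-OpenLit attributes on the span are preserved as-is.
    """
    return {OPENLIT_TO_SEMCONV.get(key, key): value for key, value in attrs.items()}
```

In practice the translator also reshapes values (for example, flat prompt strings become structured message lists), but the attribute renaming above is the heart of the mapping.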

Installation

pip install opentelemetry-util-genai-openlit-translator

Quick Start (Automatic Registration)

The easiest way to use the translator is to simply import it - no manual setup required!

from openai import OpenAI
import openlit
from dotenv import load_dotenv
import os
import traceback

load_dotenv()

try:
    # Initialize OpenLit instrumentation, exporting to a local OTLP endpoint
    openlit.init(otlp_endpoint="http://0.0.0.0:4318")

    client = OpenAI(api_key=os.getenv("OPENAI_API_KEY"))

    chat_completion = client.chat.completions.create(
        messages=[
            {
                "role": "user",
                "content": "What is LLM Observability?",
            }
        ],
        model="gpt-3.5-turbo",
    )
    print("response:", chat_completion.choices[0].message.content)
except Exception as e:
    print(f"An error occurred: {e}")
    traceback.print_exc()

Tests

pytest util/opentelemetry-util-genai-openlit-translator/tests


File details

Details for the file splunk_otel_util_genai_translator_openlit-0.1.1.tar.gz (source distribution):

| Algorithm | Hash digest |
| --- | --- |
| SHA256 | 584a9748e45a8d8603a129815be29e2f128a17cbcdf40bb8c76f708f99ee1737 |
| MD5 | 2597e459306b5a42f6e987e892ef905d |
| BLAKE2b-256 | 378fe66ed6eafbb48e5613886c0eb4520d19941a4c1de1aa0b5fadc57f070888 |

Details for the file splunk_otel_util_genai_translator_openlit-0.1.1-py3-none-any.whl (built distribution):

| Algorithm | Hash digest |
| --- | --- |
| SHA256 | ec9576db262f3bb1c7a54e9eb95290f80418b2d1e9510b7e898780517726c7f9 |
| MD5 | fe93c2b226577fc628b5d1b3ad2b3979 |
| BLAKE2b-256 | e41a5dd2b92150546f56dbfb9fd4fcb96bb89d45c41b7465b7b34efbebf02a45 |
