# OpenLit -> GenAI translator emitter for OpenTelemetry GenAI
This package automatically translates spans instrumented by the OpenLit SDK into OpenTelemetry GenAI semantic conventions. It intercepts spans carrying OpenLit-specific `gen_ai.*` attributes and emits corresponding spans with semantic-convention-compliant `gen_ai.*` attributes, enabling seamless integration between OpenLit instrumentation and GenAI observability tools.
## Mapping Table

| Old Key (OpenLit) | New Key (OTel SemConv) |
|---|---|
| `gen_ai.completion.0.content` | `gen_ai.output.messages` |
| `gen_ai.prompt.0.content` | `gen_ai.input.messages` |
| `gen_ai.prompt` | `gen_ai.input.messages` |
| `gen_ai.completion` | `gen_ai.output.messages` |
| `gen_ai.content.prompt` | `gen_ai.input.messages` |
| `gen_ai.content.completion` | `gen_ai.output.messages` |
| `gen_ai.request.embedding_dimension` | `gen_ai.embeddings.dimension.count` |
| `gen_ai.token.usage.input` | `gen_ai.usage.input_tokens` |
| `gen_ai.token.usage.output` | `gen_ai.usage.output_tokens` |
| `gen_ai.llm.provider` | `gen_ai.provider.name` |
| `gen_ai.llm.model` | `gen_ai.request.model` |
| `gen_ai.llm.temperature` | `gen_ai.request.temperature` |
| `gen_ai.llm.max_tokens` | `gen_ai.request.max_tokens` |
| `gen_ai.llm.top_p` | `gen_ai.request.top_p` |
| `gen_ai.operation.type` | `gen_ai.operation.name` |
| `gen_ai.output_messages` | `gen_ai.output.messages` |
| `gen_ai.session.id` | `gen_ai.conversation.id` |
| `gen_ai.openai.thread.id` | `gen_ai.conversation.id` |
| `gen_ai.tool.args` | `gen_ai.tool.call.arguments` |
| `gen_ai.tool.result` | `gen_ai.tool.call.result` |
| `gen_ai.vectordb.name` | `db.system.name` |
| `gen_ai.vectordb.search.query` | `db.query.text` |
| `gen_ai.vectordb.search.results_count` | `db.response.returned_rows` |
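Conceptually, the mapping above is an attribute-rename table. The sketch below expresses it as a plain Python dictionary with a small helper; this is an illustration of the idea, not the package's actual implementation, and `translate_attributes` is a hypothetical name:

```python
# Illustrative sketch of the OpenLit -> OTel SemConv attribute renaming.
# The dictionary mirrors the mapping table above (abbreviated for brevity);
# translate_attributes is a hypothetical helper, not the package's real API.

OPENLIT_TO_SEMCONV = {
    "gen_ai.prompt": "gen_ai.input.messages",
    "gen_ai.completion": "gen_ai.output.messages",
    "gen_ai.token.usage.input": "gen_ai.usage.input_tokens",
    "gen_ai.token.usage.output": "gen_ai.usage.output_tokens",
    "gen_ai.llm.provider": "gen_ai.provider.name",
    "gen_ai.llm.model": "gen_ai.request.model",
    "gen_ai.llm.temperature": "gen_ai.request.temperature",
    "gen_ai.session.id": "gen_ai.conversation.id",
    "gen_ai.tool.args": "gen_ai.tool.call.arguments",
    "gen_ai.vectordb.name": "db.system.name",
}


def translate_attributes(attributes: dict) -> dict:
    """Return a copy of *attributes* with OpenLit keys renamed to their
    semantic-convention equivalents; unmapped keys pass through unchanged."""
    return {
        OPENLIT_TO_SEMCONV.get(key, key): value
        for key, value in attributes.items()
    }


print(translate_attributes({
    "gen_ai.llm.model": "gpt-3.5-turbo",
    "gen_ai.token.usage.input": 12,
}))
```

In the package itself this translation happens transparently on intercepted spans; the sketch only shows the key-renaming step in isolation.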
## Installation

```shell
pip install opentelemetry-util-genai-openlit-translator
```
## Quick Start (Automatic Registration)

The easiest way to use the translator is simply to import it; no manual setup is required.
```python
from openai import OpenAI
import openlit
from dotenv import load_dotenv
import os
import traceback

load_dotenv()

try:
    openlit.init(otlp_endpoint="http://0.0.0.0:4318")

    client = OpenAI(api_key=os.getenv("OPENAI_API_KEY"))

    chat_completion = client.chat.completions.create(
        messages=[
            {
                "role": "user",
                "content": "What is LLM Observability?",
            }
        ],
        model="gpt-3.5-turbo",
    )
    print("response:", chat_completion.choices[0].message.content)
except Exception as e:
    print(f"An error occurred: {e}")
    traceback.print_exc()
```
## Tests

```shell
pytest util/opentelemetry-util-genai-openlit-translator/tests
```