OpenTelemetry Official OpenAI instrumentation

Project description

This library traces LLM requests and logs messages made through the OpenAI Python API library. It also captures the duration of operations and the number of tokens used as metrics.

Many LLM platforms support the OpenAI SDK, so platforms such as the following become observable with this instrumentation when accessed through it:

OpenAI Compatible Platforms

Name                            | gen_ai.system
--------------------------------|--------------
Azure OpenAI                    | az.ai.openai
Gemini                          | gemini
Perplexity                      | perplexity
xAI (Compatible with Anthropic) | xai
DeepSeek                        | deepseek
Groq                            | groq
MistralAI                       | mistral_ai
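Any of the platforms above can be reached through the standard OpenAI client by pointing base_url at the compatible endpoint; the instrumentation then traces those calls and records the matching gen_ai.system value. A minimal sketch (the DeepSeek endpoint and model name here are illustrative, not taken from this project):

```python
from openai import OpenAI

# The same OpenAI client, pointed at an OpenAI-compatible endpoint.
# Instrumented calls are traced with gen_ai.system set accordingly
# (e.g. "deepseek").
client = OpenAI(
    base_url="https://api.deepseek.com/v1",  # illustrative endpoint
    api_key="YOUR_PLATFORM_API_KEY",         # the platform's own key
)

response = client.chat.completions.create(
    model="deepseek-chat",  # illustrative model name
    messages=[{"role": "user", "content": "Hello"}],
)
```

Only the endpoint and key change; the request and response shapes stay those of the OpenAI SDK.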

Installation

If your application is already instrumented with OpenTelemetry, add this package to your requirements.

pip install opentelemetry-instrumentation-openai-v2

If you don’t have an OpenAI application yet, try our examples, which only need a valid OpenAI API key.

Check out the zero-code example for a quick start.
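The zero-code path can be sketched as follows (assuming your application's entry point is main.py; the opentelemetry-distro package provides the auto-instrumentation launcher):

```shell
pip install opentelemetry-distro opentelemetry-instrumentation-openai-v2
export OPENAI_API_KEY=...   # a valid OpenAI API key

# Run the app under the auto-instrumentation launcher; no code changes needed.
opentelemetry-instrument python main.py
```

The launcher detects installed instrumentation packages and enables them at startup.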

Usage

This section describes how to set up OpenAI instrumentation if you’re setting OpenTelemetry up manually. Check out the manual example for more details.

Instrumenting all clients

Once the instrumentor is enabled, all OpenAI clients automatically trace OpenAI operations, including chat completions and embeddings. You can also optionally capture prompts and completions as log events.

Make sure to configure OpenTelemetry tracing, logging, and events to capture all telemetry emitted by the instrumentation.
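For the tracing side, that configuration can be sketched with the OpenTelemetry SDK as follows (a console exporter is used here purely for illustration; logging and events setup is analogous):

```python
from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import ConsoleSpanExporter, SimpleSpanProcessor

# Export spans to stdout; swap in an OTLP exporter for real deployments.
provider = TracerProvider()
provider.add_span_processor(SimpleSpanProcessor(ConsoleSpanExporter()))
trace.set_tracer_provider(provider)
```

With a tracer provider registered, the spans emitted by the instrumentation below have somewhere to go.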

from openai import OpenAI

from opentelemetry.instrumentation.openai_v2 import OpenAIInstrumentor

# Instrument all OpenAI clients created after this call.
OpenAIInstrumentor().instrument()

client = OpenAI()
# Chat completion example
response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[
        {"role": "user", "content": "Write a short poem on open telemetry."},
    ],
)

# Embeddings example
embedding_response = client.embeddings.create(
    model="text-embedding-3-small",
    input="Generate vector embeddings for this text"
)

Enabling message content

Message content, such as prompt and completion text, function arguments, and return values, is not captured by default. To capture message content as log events, set the environment variable OTEL_INSTRUMENTATION_GENAI_CAPTURE_MESSAGE_CONTENT to true.
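For example, set the variable before starting your application:

```shell
# Opt in to capturing prompt/completion content as log events.
export OTEL_INSTRUMENTATION_GENAI_CAPTURE_MESSAGE_CONTENT=true
```

Leave it unset (or set to anything other than true) to keep content out of your telemetry.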

Uninstrument

To uninstrument clients, call the uninstrument method:

from opentelemetry.instrumentation.openai_v2 import OpenAIInstrumentor

OpenAIInstrumentor().instrument()
# ...

# Uninstrument all clients
OpenAIInstrumentor().uninstrument()
