
OpenTelemetry Official OpenAI instrumentation

This library traces LLM requests made with the OpenAI Python API library and can log the messages exchanged. It also records the duration of operations and the number of tokens used as metrics.

Installation

If your application is already instrumented with OpenTelemetry, add this package to your requirements.

pip install opentelemetry-instrumentation-openai-v2

If you don't have an OpenAI application yet, try our examples, which only need a valid OpenAI API key.

Check out the zero-code example for a quick start.
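As a rough sketch of what the zero-code path looks like, the opentelemetry-instrument launcher from the opentelemetry-distro package can run an application with instrumentation applied automatically. The service name and the script name (main.py) below are placeholders, not values from this project:

OTEL_SERVICE_NAME=my-openai-app \
OTEL_INSTRUMENTATION_GENAI_CAPTURE_MESSAGE_CONTENT=true \
opentelemetry-instrument python main.py

Install the launcher first with pip install opentelemetry-distro, alongside this package.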

Usage

This section describes how to set up OpenAI instrumentation if you're setting up OpenTelemetry manually. Check out the manual example for more details.

Instrumenting all clients

When using the instrumentor, all clients will automatically trace OpenAI chat completion operations. You can also optionally capture prompts and completions as log events.

Make sure to configure OpenTelemetry tracing, logging, and events to capture all telemetry emitted by the instrumentation.

from openai import OpenAI
from opentelemetry.instrumentation.openai_v2 import OpenAIInstrumentor

# Patch the OpenAI client library; clients created afterwards are traced.
OpenAIInstrumentor().instrument()

client = OpenAI()
response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[
        {"role": "user", "content": "Write a short poem on open telemetry."},
    ],
)

Enabling message content

Message content such as the contents of the prompt, completion, function arguments and return values are not captured by default. To capture message content as log events, set the environment variable OTEL_INSTRUMENTATION_GENAI_CAPTURE_MESSAGE_CONTENT to true.
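The variable must be set in the process environment before the instrumented calls run, either exported in the shell or set early in the program. A minimal sketch in Python:

```python
import os

# Opt in to capturing prompt and completion content as log events.
# This must happen before the instrumented OpenAI calls are made.
os.environ["OTEL_INSTRUMENTATION_GENAI_CAPTURE_MESSAGE_CONTENT"] = "true"
```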

Uninstrument

To uninstrument clients, call the uninstrument method:

from opentelemetry.instrumentation.openai_v2 import OpenAIInstrumentor

OpenAIInstrumentor().instrument()
# ...

# Uninstrument all clients
OpenAIInstrumentor().uninstrument()

Bucket Boundaries

This section describes the explicit bucket boundaries for metrics such as token usage and operation duration, and guides users to create Views to implement them according to the semantic conventions.

The bucket boundaries are defined as follows:

  • For gen_ai.client.token.usage: [1, 4, 16, 64, 256, 1024, 4096, 16384, 65536, 262144, 1048576, 4194304, 16777216, 67108864]

  • For gen_ai.client.operation.duration: [0.01, 0.02, 0.04, 0.08, 0.16, 0.32, 0.64, 1.28, 2.56, 5.12, 10.24, 20.48, 40.96, 81.92]
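Both lists are geometric progressions: powers of 4 for token counts, and 0.01 s doubled repeatedly for duration. They can therefore be generated rather than hard-coded, as this small sketch shows:

```python
# 14 token-usage boundaries: 4**0 .. 4**13
token_buckets = [4 ** i for i in range(14)]

# 14 duration boundaries in seconds: 0.01 * 2**0 .. 0.01 * 2**13
duration_buckets = [round(0.01 * 2 ** i, 2) for i in range(14)]

print(token_buckets[-1])     # 67108864
print(duration_buckets[-1])  # 81.92
```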

To implement these bucket boundaries, you can create Views in your OpenTelemetry SDK setup. Here is an example:

from opentelemetry.exporter.otlp.proto.grpc.metric_exporter import OTLPMetricExporter
from opentelemetry.metrics import set_meter_provider
from opentelemetry.sdk.metrics import MeterProvider
from opentelemetry.sdk.metrics.export import PeriodicExportingMetricReader
from opentelemetry.sdk.metrics.view import ExplicitBucketHistogramAggregation, View

views = [
    View(
        instrument_name="gen_ai.client.token.usage",
        aggregation=ExplicitBucketHistogramAggregation(
            [1, 4, 16, 64, 256, 1024, 4096, 16384, 65536, 262144, 1048576, 4194304, 16777216, 67108864]
        ),
    ),
    View(
        instrument_name="gen_ai.client.operation.duration",
        aggregation=ExplicitBucketHistogramAggregation(
            [0.01, 0.02, 0.04, 0.08, 0.16, 0.32, 0.64, 1.28, 2.56, 5.12, 10.24, 20.48, 40.96, 81.92]
        ),
    ),
]

metric_exporter = OTLPMetricExporter(endpoint="http://localhost:4317")
metric_reader = PeriodicExportingMetricReader(metric_exporter)
provider = MeterProvider(
    metric_readers=[metric_reader],
    views=views,
)
set_meter_provider(provider)

For more details, refer to the OpenTelemetry GenAI Metrics documentation.
