
OpenInference instrumentation utilities

Project description

OpenInference Instrumentation


Utility functions for OpenInference instrumentation.

Installation

pip install openinference-instrumentation

Customizing Spans

The openinference-instrumentation package offers utilities to attach important application context, such as session IDs, user IDs, metadata, tags, and prompt templates, to spans using Python context managers:

  • using_session: to specify a session ID that tracks and groups a multi-turn conversation with a user
  • using_user: to specify a user ID to track different conversations with a given user
  • using_metadata: to add custom metadata that provides extra information to support a wide range of operational needs
  • using_tags: to add tags that help filter on specific keywords
  • using_prompt_template: to record the prompt template used, along with its version and variables; useful for prompt template management
  • using_attributes: to set multiple of the above attributes at once in a concise manner

For example:

from openinference.instrumentation import using_attributes
tags = ["business_critical", "simple", ...]
metadata = {
    "country": "United States",
    "topic": "weather",
    # ... any other key-value pairs
}
prompt_template = "Please describe the weather forecast for {city} on {date}"
prompt_template_variables = {"city": "Johannesburg", "date": "July 11"}
prompt_template_version = "v1.0"
with using_attributes(
    session_id="my-session-id",
    user_id="my-user-id",
    metadata=metadata,
    tags=tags,
    prompt_template=prompt_template,
    prompt_template_version=prompt_template_version,
    prompt_template_variables=prompt_template_variables,
):
    # Calls within this block will generate spans with the attributes:
    # "session.id" = "my-session-id"
    # "user.id" = "my-user-id"
    # "metadata" = "{\"key-1\": value_1, \"key-2\": value_2, ... }" # JSON serialized
    # "tag.tags" = "["tag_1","tag_2",...]"
    # "llm.prompt_template.template" = "Please describe the weather forecast for {city} on {date}"
    # "llm.prompt_template.variables" = "{\"city\": \"Johannesburg\", \"date\": \"July 11\"}" # JSON serialized
    # "llm.prompt_template.version " = "v1.0"
    ...
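
Each of these context managers can also be used on its own, and they nest freely. Below is a minimal sketch combining a few of them; the call_llm function is a hypothetical stand-in for any call made through an instrumented client:

from openinference.instrumentation import using_metadata, using_session, using_user

def call_llm(prompt: str) -> str:
    ...  # hypothetical placeholder for a call made through an instrumented client

with using_session(session_id="my-session-id"):
    with using_user(user_id="my-user-id"):
        with using_metadata({"topic": "weather"}):
            # Spans created in this block carry "session.id", "user.id", and "metadata".
            call_llm("Please describe the weather forecast for Johannesburg on July 11")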

You can read more about this in our docs.

Tracing Configuration

This package contains the central TraceConfig class, which lets you specify a tracing configuration to control settings such as data privacy and payload size. For instance, you may want to keep sensitive information from being logged for security reasons, or limit the size of the base64-encoded images that are logged to reduce payload size.

In addition, you can also use environment variables; read more here. The following is an example of using the TraceConfig object:

from openinference.instrumentation import TraceConfig
from openinference.instrumentation.openai import OpenAIInstrumentor  # from the openinference-instrumentation-openai package

config = TraceConfig(
    hide_inputs=hide_inputs,                          # bool: hide input payloads
    hide_outputs=hide_outputs,                        # bool: hide output payloads
    hide_input_messages=hide_input_messages,          # bool: hide LLM input messages
    hide_output_messages=hide_output_messages,        # bool: hide LLM output messages
    hide_input_images=hide_input_images,              # bool: hide images in input messages
    hide_input_text=hide_input_text,                  # bool: hide text in input messages
    hide_output_text=hide_output_text,                # bool: hide text in output messages
    base64_image_max_length=base64_image_max_length,  # int: cap on logged base64 image length
)
tracer_provider = ...
# This example uses the OpenAIInstrumentor, but it works with any of our auto instrumentors
OpenAIInstrumentor().instrument(tracer_provider=tracer_provider, config=config)
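
Alternatively, the same settings can be supplied through environment variables set before the instrumentor is initialized. A minimal sketch, assuming the OPENINFERENCE_HIDE_* and OPENINFERENCE_BASE64_IMAGE_MAX_LENGTH variable names described in the environment-variable docs referenced above:

import os

from openinference.instrumentation.openai import OpenAIInstrumentor

# These variable names are taken from the environment-variable docs referenced above;
# treat them as assumptions and double-check them against the docs for your version.
os.environ["OPENINFERENCE_HIDE_INPUTS"] = "true"               # hide input payloads
os.environ["OPENINFERENCE_HIDE_OUTPUT_TEXT"] = "true"          # hide text in output messages
os.environ["OPENINFERENCE_BASE64_IMAGE_MAX_LENGTH"] = "32000"  # cap logged base64 images

# With no explicit TraceConfig passed, the instrumentor falls back to these variables.
OpenAIInstrumentor().instrument(tracer_provider=tracer_provider)  # same tracer_provider as above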



Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

openinference_instrumentation-0.1.16.tar.gz (11.5 kB)


Built Distribution

openinference_instrumentation-0.1.16-py3-none-any.whl

File details

Details for the file openinference_instrumentation-0.1.16.tar.gz.

File metadata

File hashes

Hashes for openinference_instrumentation-0.1.16.tar.gz:

SHA256: 92952070c287d790405adcb83497d7f33b118f142863dc35e271608488fe437d
MD5: 80a002dc0a040694e24e9ec620d2d015
BLAKE2b-256: e7a26ee1937da0bca2703406b76b4f572d768c74dee4aa68b6cc74df8bd52ba0


File details

Details for the file openinference_instrumentation-0.1.16-py3-none-any.whl.

File metadata

File hashes

Hashes for openinference_instrumentation-0.1.16-py3-none-any.whl:

SHA256: 4597e4cc359ff921afa12a5e8df98bf2f2968f14a2ffbb78d22bf0d7ce07cc2c
MD5: c5237713e91b4a7bb52f23bd1bfea68a
BLAKE2b-256: 88d38764f124ecf54643544c33752072b4dff7fcefbc6e02ac88deb06c460b96

