
OpenTelemetry Official LangChain Instrumentation

Project description

This package provides OpenTelemetry instrumentation for LangChain LLM/chat workflows. It leverages the Splunk distribution of opentelemetry-util-genai to produce telemetry that follows the OpenTelemetry GenAI semantic conventions; see that package for core concepts, high-level usage, and configuration.

Status: Alpha (APIs and produced telemetry are subject to change).

Installation

Install from source:

pip install -e splunk-otel-instrumentation-langchain

This pulls in the required OpenTelemetry core packages plus opentelemetry-util-genai.

Quick Start

Manual Instrumentation (development/debugging)

from opentelemetry.instrumentation.langchain import LangChainInstrumentor
from langchain_openai import ChatOpenAI
from langchain_core.messages import HumanMessage, SystemMessage

# manual instrumentation, easy to debug in your IDE
LangChainInstrumentor().instrument()

llm = ChatOpenAI(model="gpt-4o-mini", temperature=0.0)
messages = [
    SystemMessage(content="You are a helpful assistant."),
    HumanMessage(content="What is the capital of France?"),
]
response = llm.invoke(messages)
print(response.content)

Zero-code instrumentation

In zero-code instrumentation mode, install opentelemetry-distro and run your app with the OpenTelemetry LangChain instrumentation enabled:

opentelemetry-instrument python your_langchain_app.py

The application itself needs no instrumentation calls:

from langchain_openai import ChatOpenAI
from langchain_core.messages import HumanMessage, SystemMessage

llm = ChatOpenAI(model="gpt-4o-mini", temperature=0.0)
messages = [
    SystemMessage(content="You are a helpful assistant."),
    HumanMessage(content="What is the capital of France?"),
]
response = llm.invoke(messages)
print(response.content)
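In zero-code mode the SDK is configured entirely through standard OpenTelemetry environment variables. A sketch of a typical setup (the service name and endpoint values are placeholders):

```shell
# Standard OpenTelemetry SDK settings (values are placeholders)
export OTEL_SERVICE_NAME=my-langchain-app
export OTEL_EXPORTER_OTLP_ENDPOINT=http://localhost:4317
export OTEL_TRACES_EXPORTER=otlp

opentelemetry-instrument python your_langchain_app.py
```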

Testing

Run the package tests (from repository root or this directory):

pytest -k langchain instrumentation-genai/opentelemetry-instrumentation-langchain-alpha/tests

(Recorded cassettes or proper API keys may be required for full integration tests.)

Contributing

Issues / PRs welcome in the main otel-splunk-python-contrib repository. This module is alpha: feedback on attribute coverage, performance, and LangChain surface expansion is especially helpful.

License

Apache 2.0
