OpenTelemetry Official LangChain Instrumentation

Project description

This package provides OpenTelemetry instrumentation for LangChain LLM/chat workflows. It builds on the Splunk distribution of opentelemetry-util-genai to produce telemetry that follows the OpenTelemetry GenAI semantic conventions. Core concepts, high-level usage, and configuration are covered in the opentelemetry-util-genai documentation.

Status: Alpha (APIs and produced telemetry are subject to change).

Installation

Install from source:

pip install -e splunk-otel-instrumentation-langchain

This will pull in the required OpenTelemetry core packages plus opentelemetry-util-genai.

Quick Start

Manual Instrumentation (development/debugging)

from opentelemetry.instrumentation.langchain import LangChainInstrumentor
from langchain_openai import ChatOpenAI
from langchain_core.messages import HumanMessage, SystemMessage

# manual instrumentation, easy to debug in your IDE
LangChainInstrumentor().instrument()

llm = ChatOpenAI(model="gpt-4o-mini", temperature=0.0)
messages = [
    SystemMessage(content="You are a helpful assistant."),
    HumanMessage(content="What is the capital of France?"),
]
response = llm.invoke(messages)
print(response.content)

Zero-code instrumentation

In zero-code instrumentation mode, install opentelemetry-distro and run your app with the OpenTelemetry LangChain instrumentor enabled:

opentelemetry-instrument python your_langchain_app.py

where your_langchain_app.py contains no instrumentation code at all, for example:

from langchain_openai import ChatOpenAI
from langchain_core.messages import HumanMessage, SystemMessage

llm = ChatOpenAI(model="gpt-4o-mini", temperature=0.0)
messages = [
    SystemMessage(content="You are a helpful assistant."),
    HumanMessage(content="What is the capital of France?"),
]
response = llm.invoke(messages)
print(response.content)
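
Zero-code mode picks up the standard OpenTelemetry environment variables. A sketch of a typical configuration (the service name and endpoint below are illustrative placeholders, not defaults of this package):

```shell
# Name the service and point the OTLP exporter at a local collector
# (both values are illustrative).
export OTEL_SERVICE_NAME=my-langchain-app
export OTEL_EXPORTER_OTLP_ENDPOINT=http://localhost:4317

# Use "console" instead of "otlp" to print spans to stdout while debugging.
export OTEL_TRACES_EXPORTER=otlp

opentelemetry-instrument python your_langchain_app.py
```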

Testing

Run the package tests (from repository root or this directory):

pytest -k langchain instrumentation-genai/opentelemetry-instrumentation-langchain-alpha/tests

(Recorded cassettes or proper API keys may be required for full integration tests.)
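
For integration tests that hit a live model, credentials are typically supplied via the provider's usual environment variable; a sketch assuming OpenAI-backed tests (the variable name follows the langchain_openai convention):

```shell
# Supply credentials for live-model tests; recorded cassettes cover the rest.
export OPENAI_API_KEY=...  # your key here
pytest -k langchain instrumentation-genai/opentelemetry-instrumentation-langchain-alpha/tests
```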

Contributing

Issues and PRs are welcome in the main otel-splunk-python-contrib repository. This module is in alpha: feedback on attribute coverage, performance, and LangChain surface expansion is especially helpful.

License

Apache 2.0
