OpenTelemetry Official LangChain Instrumentation

Project description

This package provides OpenTelemetry instrumentation for LangChain LLM/chat workflows. It leverages the Splunk distribution of opentelemetry-util-genai to produce telemetry that follows the OpenTelemetry GenAI semantic conventions. Core concepts, high-level usage, and configuration are documented in that package.

Status: Alpha (APIs and produced telemetry are subject to change).

Installation

Install from source:

pip install -e splunk-otel-instrumentation-langchain

This pulls in the required OpenTelemetry core packages plus opentelemetry-util-genai.

Quick Start

Manual Instrumentation (development/debugging)

from opentelemetry.instrumentation.langchain import LangChainInstrumentor
from langchain_openai import ChatOpenAI
from langchain_core.messages import HumanMessage, SystemMessage

# manual instrumentation, easy to debug in your IDE
LangChainInstrumentor().instrument()

llm = ChatOpenAI(model="gpt-4o-mini", temperature=0.0)
messages = [
    SystemMessage(content="You are a helpful assistant."),
    HumanMessage(content="What is the capital of France?"),
]
response = llm.invoke(messages)
print(response.content)

Zero-code instrumentation

In zero-code instrumentation mode, install opentelemetry-distro and run your app with the OpenTelemetry LangChain instrumentor enabled:

opentelemetry-instrument python your_langchain_app.py

The application itself contains no instrumentation code:

from langchain_openai import ChatOpenAI
from langchain_core.messages import HumanMessage, SystemMessage

llm = ChatOpenAI(model="gpt-4o-mini", temperature=0.0)
messages = [
    SystemMessage(content="You are a helpful assistant."),
    HumanMessage(content="What is the capital of France?"),
]
response = llm.invoke(messages)
print(response.content)
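opentelemetry-instrument reads the standard OpenTelemetry SDK environment variables, so the service name and exporter can be configured without code changes. A sketch (the endpoint value is illustrative, assuming a local OTLP collector):

```shell
# Standard OTel SDK environment variables; endpoint shown is an example
export OTEL_SERVICE_NAME=my-langchain-app
export OTEL_TRACES_EXPORTER=otlp
export OTEL_EXPORTER_OTLP_ENDPOINT=http://localhost:4317

opentelemetry-instrument python your_langchain_app.py
```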

Testing

Run the package tests (from repository root or this directory):

pytest -k langchain instrumentation-genai/opentelemetry-instrumentation-langchain-alpha/tests

(Recorded cassettes or proper API keys may be required for full integration tests.)

Contributing

Issues / PRs welcome in the main otel-splunk-python-contrib repository. This module is alpha: feedback on attribute coverage, performance, and LangChain surface expansion is especially helpful.

License

Apache 2.0
