Official OpenTelemetry LangChain instrumentation

Project description

This package provides OpenTelemetry instrumentation for LangChain LLM/chat workflows. It leverages the Splunk distribution of opentelemetry-util-genai to produce telemetry following the OpenTelemetry GenAI semantic conventions. Core concepts, high-level usage, and configuration are documented in opentelemetry-util-genai.

Status: Alpha (APIs and produced telemetry are subject to change).

Installation

Install from source:

pip install -e splunk-otel-instrumentation-langchain

This will pull in the required OpenTelemetry core packages plus opentelemetry-util-genai.
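A quick sanity check that the environment is set up correctly (assuming the package and its OpenTelemetry dependencies were installed into the active environment):

```shell
# Verify the instrumentor is importable from the active environment
python -c "from opentelemetry.instrumentation.langchain import LangChainInstrumentor; print('import ok')"
```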

Quick Start

Manual Instrumentation (development/debugging)

from opentelemetry.instrumentation.langchain import LangChainInstrumentor
from langchain_openai import ChatOpenAI
from langchain_core.messages import HumanMessage, SystemMessage

# manual instrumentation, easy to debug in your IDE
LangChainInstrumentor().instrument()

llm = ChatOpenAI(model="gpt-4o-mini", temperature=0.0)
messages = [
    SystemMessage(content="You are a helpful assistant."),
    HumanMessage(content="What is the capital of France?"),
]
response = llm.invoke(messages)
print(response.content)

Zero-code instrumentation

For zero-code instrumentation, ensure an OpenTelemetry distribution is installed and run your app through the opentelemetry-instrument launcher with the OpenTelemetry LangChain instrumentor enabled:

opentelemetry-instrument python your_langchain_app.py

your_langchain_app.py itself contains no instrumentation code:

from langchain_openai import ChatOpenAI
from langchain_core.messages import HumanMessage, SystemMessage

llm = ChatOpenAI(model="gpt-4o-mini", temperature=0.0)
messages = [
    SystemMessage(content="You are a helpful assistant."),
    HumanMessage(content="What is the capital of France?"),
]
response = llm.invoke(messages)
print(response.content)
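In the zero-code path, exporter selection and service identity are typically controlled through the standard OpenTelemetry SDK environment variables; for example (values here are illustrative):

```shell
# Standard OpenTelemetry SDK environment variables (example values)
export OTEL_SERVICE_NAME=my-langchain-app
export OTEL_TRACES_EXPORTER=console          # or: otlp
export OTEL_EXPORTER_OTLP_ENDPOINT=http://localhost:4317

opentelemetry-instrument python your_langchain_app.py
```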

Testing

Run the package tests (from repository root or this directory):

pytest -k langchain instrumentation-genai/opentelemetry-instrumentation-langchain-alpha/tests

(Recorded cassettes or proper API keys may be required for full integration tests.)
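For the live integration tests, credentials are usually supplied through the provider's standard environment variable; assuming the OpenAI-backed tests use langchain_openai, that would look like the following (OPENAI_API_KEY is the OpenAI client's default variable, not something this package defines):

```shell
# Supply a real key only when running live integration tests;
# recorded cassettes cover the rest.
export OPENAI_API_KEY=sk-...
pytest -k langchain instrumentation-genai/opentelemetry-instrumentation-langchain-alpha/tests
```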

Contributing

Issues and PRs are welcome in the main otel-splunk-python-contrib repository. This module is alpha: feedback on attribute coverage, performance, and expanding coverage of the LangChain surface is especially helpful.

License

Apache 2.0

