OpenInference LangChain Instrumentation

Python auto-instrumentation library for LangChain.

Traces produced by this library are fully OpenTelemetry-compatible and can be sent to an OpenTelemetry collector, such as arize-phoenix, for viewing.


Compatibility

This instrumentation works with:

  • LangChain 1.x (langchain>=1.0.0): Modern agent framework built on LangGraph
  • LangChain Classic (langchain-classic>=1.0.0): Legacy chains and tools (formerly langchain 0.x)
  • All LangChain partner packages (langchain-openai, langchain-anthropic, langchain-google-vertexai, etc.)

The instrumentation hooks into langchain-core, which is the shared foundation used by all LangChain packages.
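
Because the hooks live in langchain-core, models and chains from any partner package are traced without extra configuration. As a minimal sketch (assuming langchain-anthropic is installed and the instrumentor has been set up as in the Quickstart below; the model name is only illustrative):

from langchain_anthropic import ChatAnthropic

# Once LangChainInstrumentor().instrument() has run, this call is traced
# through the langchain-core callback system like any other LangChain call.
model = ChatAnthropic(model="claude-3-5-sonnet-latest")
model.invoke("Hello!")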

Installation

For LangChain 1.x (Recommended for New Projects)

pip install openinference-instrumentation-langchain langchain langchain-openai

For LangChain Classic (Legacy Applications)

pip install openinference-instrumentation-langchain langchain-classic langchain-openai

For Both (Migration Scenarios)

pip install openinference-instrumentation-langchain langchain langchain-classic langchain-openai
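
If you are migrating and unsure which distributions an existing environment already has, a quick check with the standard library (a sketch, no third-party dependencies assumed):

from importlib.metadata import PackageNotFoundError, version

# Report which LangChain distributions are installed and at which versions.
for dist in ("langchain", "langchain-classic", "langchain-core"):
    try:
        print(dist, version(dist))
    except PackageNotFoundError:
        print(dist, "not installed")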

Quickstart

Example with LangChain 1.x (New Agent Framework)

Install packages needed for this demonstration.

pip install openinference-instrumentation-langchain langchain langchain-openai arize-phoenix opentelemetry-sdk opentelemetry-exporter-otlp

Start the Phoenix app in the background as a collector. By default, it listens on http://localhost:6006. You can visit the app via a browser at the same address.

The Phoenix app does not send data over the internet. It only operates locally on your machine.

python -m phoenix.server.main serve
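
If you prefer to start Phoenix from a notebook or script instead of the command line, the arize-phoenix package also provides a launcher; a minimal sketch:

import phoenix as px

# Starts the Phoenix collector and UI in the current process
# (listening on http://localhost:6006 by default).
px.launch_app()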

The following Python code sets up the LangChainInstrumentor to trace langchain and send the traces to Phoenix at the endpoint shown below.

from langchain.agents import create_agent
from langchain_openai import ChatOpenAI
from openinference.instrumentation.langchain import LangChainInstrumentor
from opentelemetry import trace as trace_api
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk import trace as trace_sdk
from opentelemetry.sdk.trace.export import ConsoleSpanExporter, SimpleSpanProcessor

endpoint = "http://127.0.0.1:6006/v1/traces"
tracer_provider = trace_sdk.TracerProvider()
trace_api.set_tracer_provider(tracer_provider)
tracer_provider.add_span_processor(SimpleSpanProcessor(OTLPSpanExporter(endpoint)))
tracer_provider.add_span_processor(SimpleSpanProcessor(ConsoleSpanExporter()))

LangChainInstrumentor().instrument()
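
If another component has already set the global tracer provider, or you prefer not to rely on the global registry, the instrumentor also accepts a provider explicitly. A sketch, reusing the tracer_provider created above:

# Equivalent to the call above, but wires the provider in directly.
LangChainInstrumentor().instrument(tracer_provider=tracer_provider)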

To demonstrate tracing, we'll create a simple agent. First, configure your OpenAI credentials.

import os

os.environ["OPENAI_API_KEY"] = "<your openai key>"

Now we can create an agent and run it.

def get_weather(city: str) -> str:
    """Get the weather for a city."""
    return f"The weather in {city} is sunny!"

model = ChatOpenAI(model="gpt-4")
agent = create_agent(model, tools=[get_weather])
result = agent.invoke({"messages": [{"role": "user", "content": "What's the weather in Paris?"}]})
print(result)
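
The returned value is the agent's final state. Assuming the LangGraph-style state dict shown above, the last message carries the model's answer:

# The "messages" list ends with the final AI message (assumes the
# LangGraph-style agent state returned by create_agent).
print(result["messages"][-1].content)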

Example with LangChain Classic (Legacy Chains)

For legacy applications using LangChain Classic:

from langchain_classic.chains import LLMChain
from langchain_core.prompts import PromptTemplate
from langchain_openai import OpenAI

# ... (same instrumentation setup as above)

prompt_template = "Tell me a {adjective} joke"
prompt = PromptTemplate(input_variables=["adjective"], template=prompt_template)
llm = LLMChain(llm=OpenAI(), prompt=prompt, metadata={"category": "jokes"})
completion = llm.predict(adjective="funny", metadata={"variant": "funny"})
print(completion)

Visit the Phoenix app at http://localhost:6006 to see the traces.
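
If you need to stop tracing later, for example between test cases, the instrumentation can be detached again; a short sketch:

# Removes the LangChain hooks installed by instrument().
LangChainInstrumentor().uninstrument()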

