OpenInference LangChain Instrumentation
Python auto-instrumentation library for LangChain.
Traces produced by this instrumentation are fully OpenTelemetry compatible and can be sent to an OpenTelemetry collector for viewing, such as Arize Phoenix (`arize-phoenix`).
Compatibility
This instrumentation works with:

- LangChain 1.x (`langchain>=1.0.0`): the modern agent framework built on LangGraph
- LangChain Classic (`langchain-classic>=1.0.0`): legacy chains and tools (formerly `langchain` 0.x)
- All LangChain partner packages (`langchain-openai`, `langchain-anthropic`, `langchain-google-vertexai`, etc.)

The instrumentation hooks into `langchain-core`, the shared foundation used by all LangChain packages.
Installation
For LangChain 1.x (Recommended for New Projects)
```
pip install openinference-instrumentation-langchain langchain langchain-openai
```
For LangChain Classic (Legacy Applications)
```
pip install openinference-instrumentation-langchain langchain-classic langchain-openai
```
For Both (Migration Scenarios)
```
pip install openinference-instrumentation-langchain langchain langchain-classic langchain-openai
```
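In a migration scenario it can help to check which LangChain distributions are already present before choosing an install command. A minimal stdlib sketch (the `installed_version` helper is ours, not part of this library):

```python
from importlib.metadata import PackageNotFoundError, version
from typing import Optional


def installed_version(dist: str) -> Optional[str]:
    """Return the installed version of a distribution, or None if it is absent."""
    try:
        return version(dist)
    except PackageNotFoundError:
        return None


# Report which LangChain flavors are importable in the current environment.
for dist in ("langchain", "langchain-classic", "langchain-core"):
    print(f"{dist}: {installed_version(dist) or 'not installed'}")
```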
Quickstart
Example with LangChain 1.x (New Agent Framework)
Install packages needed for this demonstration.
```
pip install openinference-instrumentation-langchain langchain langchain-openai arize-phoenix opentelemetry-sdk opentelemetry-exporter-otlp
```
Start the Phoenix app in the background as a collector. By default, it listens on http://localhost:6006. You can visit the app via a browser at the same address.
The Phoenix app does not send data over the internet. It only operates locally on your machine.
```
python -m phoenix.server.main serve
```
The following Python code sets up the `LangChainInstrumentor` to trace LangChain and send the traces to Phoenix at the endpoint shown below.
```python
from langchain.agents import create_agent
from langchain_openai import ChatOpenAI
from openinference.instrumentation.langchain import LangChainInstrumentor
from opentelemetry import trace as trace_api
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk import trace as trace_sdk
from opentelemetry.sdk.trace.export import ConsoleSpanExporter, SimpleSpanProcessor

endpoint = "http://127.0.0.1:6006/v1/traces"
tracer_provider = trace_sdk.TracerProvider()
trace_api.set_tracer_provider(tracer_provider)
tracer_provider.add_span_processor(SimpleSpanProcessor(OTLPSpanExporter(endpoint)))
tracer_provider.add_span_processor(SimpleSpanProcessor(ConsoleSpanExporter()))

LangChainInstrumentor().instrument()
```
To demonstrate tracing, we'll create a simple agent. First, configure your OpenAI credentials.
```python
import os

os.environ["OPENAI_API_KEY"] = "<your openai key>"
```
Now we can create an agent and run it.
```python
def get_weather(city: str) -> str:
    """Get the weather for a city."""
    return f"The weather in {city} is sunny!"


model = ChatOpenAI(model="gpt-4")
agent = create_agent(model, tools=[get_weather])

result = agent.invoke(
    {"messages": [{"role": "user", "content": "What's the weather in Paris?"}]}
)
print(result)
```
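The `invoke` call returns a state dict whose `"messages"` list ends with the model's reply. A sketch of pulling out just the final text, shown here with a stand-in message object rather than a real run (the `final_text` helper and `_FakeMessage` class are ours, for illustration only):

```python
from dataclasses import dataclass


@dataclass
class _FakeMessage:
    """Stand-in for a LangChain message; real messages also expose .content."""

    content: str


def final_text(result: dict) -> str:
    """Return the content of the last message in an agent result."""
    return result["messages"][-1].content


# With a real run, pass the `result` from agent.invoke(...) instead.
demo = {
    "messages": [
        _FakeMessage("What's the weather in Paris?"),
        _FakeMessage("The weather in Paris is sunny!"),
    ]
}
print(final_text(demo))
```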
Example with LangChain Classic (Legacy Chains)
For legacy applications using LangChain Classic:
```python
from langchain_classic.chains import LLMChain
from langchain_core.prompts import PromptTemplate
from langchain_openai import OpenAI

# ... (same instrumentation setup as above)

prompt_template = "Tell me a {adjective} joke"
prompt = PromptTemplate(input_variables=["adjective"], template=prompt_template)
chain = LLMChain(llm=OpenAI(), prompt=prompt, metadata={"category": "jokes"})

# Per-call metadata goes in the RunnableConfig, not in the chain inputs.
completion = chain.invoke(
    {"adjective": "funny"}, config={"metadata": {"variant": "funny"}}
)
print(completion["text"])
```
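A `PromptTemplate` with the default f-string format substitutes named variables much like Python's `str.format`; a quick stdlib illustration of what the template above renders to:

```python
prompt_template = "Tell me a {adjective} joke"

# PromptTemplate's default "f-string" format performs a named substitution
# equivalent to str.format on the template string.
rendered = prompt_template.format(adjective="funny")
print(rendered)  # -> Tell me a funny joke
```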
Visit the Phoenix app at http://localhost:6006 to see the traces.
File details
Details for the file openinference_instrumentation_langchain-0.1.61.tar.gz.

File metadata
- Download URL: openinference_instrumentation_langchain-0.1.61.tar.gz
- Upload date:
- Size: 75.1 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.14.3

File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | 210686a6cc42f8b16da1c450316025a11f6cf16f70b1a2dea7945dc16a98aa87 |
| MD5 | b6aed40868d493b2c434e80b6c69da8e |
| BLAKE2b-256 | 21606c298fd6d6778fed0bf7cddf9c97c4391f1cdfc15fe44ddeb732bd09a695 |
File details
Details for the file openinference_instrumentation_langchain-0.1.61-py3-none-any.whl.

File metadata
- Download URL: openinference_instrumentation_langchain-0.1.61-py3-none-any.whl
- Upload date:
- Size: 24.4 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.14.3

File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | 0f80198cc5937c1a8e19f15143253d59b094f36b2b18308570b4c4ddeb506020 |
| MD5 | c4575c28fe573b9ab8d00eb98428a1ab |
| BLAKE2b-256 | 859e5c7fa64fe28de0b5a59df2fe3007a022e9cd263e7e3ed942a18f05e14c4b |