OpenInference LangChain Instrumentation
Python auto-instrumentation library for LangChain.
These traces are fully OpenTelemetry compatible and can be sent to an OpenTelemetry collector for viewing, such as arize-phoenix.
Installation
pip install openinference-instrumentation-langchain
Quickstart
Install packages needed for this demonstration.
pip install openinference-instrumentation-langchain langchain arize-phoenix opentelemetry-sdk opentelemetry-exporter-otlp
Start the Phoenix app in the background as a collector. By default, it listens on http://localhost:6006. You can visit the app via a browser at the same address.
The Phoenix app does not send data over the internet. It only operates locally on your machine.
python -m phoenix.server.main serve
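Alternatively, if you are working in a notebook, Phoenix can be launched in-process via the arize-phoenix Python API; a minimal sketch:

import phoenix as px

# Starts the Phoenix server in the current process; the UI address is printed on startup.
px.launch_app()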
The following Python code sets up the LangChainInstrumentor to trace langchain and send the traces to Phoenix at the endpoint shown below.
from langchain.chains import LLMChain
from langchain_core.prompts import PromptTemplate
from langchain_openai import OpenAI
from openinference.instrumentation.langchain import LangChainInstrumentor
from opentelemetry import trace as trace_api
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk import trace as trace_sdk
from opentelemetry.sdk.trace.export import ConsoleSpanExporter, SimpleSpanProcessor

# Export spans to Phoenix over OTLP and echo them to the console.
endpoint = "http://127.0.0.1:6006/v1/traces"
tracer_provider = trace_sdk.TracerProvider()
trace_api.set_tracer_provider(tracer_provider)
tracer_provider.add_span_processor(SimpleSpanProcessor(OTLPSpanExporter(endpoint)))
tracer_provider.add_span_processor(SimpleSpanProcessor(ConsoleSpanExporter()))

# Instrument LangChain so its runs are emitted as OpenInference spans.
LangChainInstrumentor().instrument()
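SimpleSpanProcessor exports each span synchronously, which is convenient for a demo but blocks on every span. For production use, OpenTelemetry's BatchSpanProcessor is the usual choice; a minimal sketch:

from opentelemetry.sdk.trace.export import BatchSpanProcessor

# Queue spans and export them in the background instead of blocking per span.
tracer_provider.add_span_processor(BatchSpanProcessor(OTLPSpanExporter(endpoint)))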
To demonstrate langchain tracing, we'll make a simple chain to tell a joke. First, configure your OpenAI credentials.
import os
os.environ["OPENAI_API_KEY"] = "<your openai key>"
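If you prefer not to hardcode the key in source, one option is to prompt for it at runtime; a minimal sketch using the standard library:

import getpass
import os

# Ask for the key interactively if it is not already set in the environment.
if not os.environ.get("OPENAI_API_KEY"):
    os.environ["OPENAI_API_KEY"] = getpass.getpass("OpenAI API key: ")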
Now we can create a chain and run it.
prompt_template = "Tell me a {adjective} joke"
prompt = PromptTemplate(input_variables=["adjective"], template=prompt_template)
# The metadata passed here is recorded on the resulting spans.
chain = LLMChain(llm=OpenAI(), prompt=prompt, metadata={"category": "jokes"})
completion = chain.predict(adjective="funny", metadata={"variant": "funny"})
print(completion)
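Note that recent LangChain releases deprecate LLMChain in favor of the runnable (LCEL) style. Because the instrumentor hooks into LangChain's callback system, chains built that way are traced as well; a minimal equivalent sketch:

# Compose the prompt and LLM into a runnable sequence with LCEL.
lcel_chain = prompt | OpenAI()
print(lcel_chain.invoke({"adjective": "funny"}))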
Visit the Phoenix app at http://localhost:6006 to see the traces.
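To stop tracing later, the instrumentor can be detached with the standard uninstrument() call it inherits from OpenTelemetry's BaseInstrumentor:

# Remove the OpenInference hooks from LangChain's callback system.
LangChainInstrumentor().uninstrument()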
More Info
More details about tracing with OpenInference and Phoenix can be found in the Phoenix documentation.
For AI/ML observability solutions in production, including a cloud-based trace collector, visit Arize.
File details
Details for the file openinference_instrumentation_langchain-0.1.26.tar.gz.
File metadata
- Download URL: openinference_instrumentation_langchain-0.1.26.tar.gz
- Size: 43.5 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.1.1 CPython/3.12.4
File hashes
Algorithm | Hash digest
---|---
SHA256 | e73fdc9b09fc4127610f0616641088902cb9873ddae265b34efb86793b17b5f6
MD5 | f43537e94358127297458622a556262e
BLAKE2b-256 | 085f6f05470846de8a476d770370e19377c66d80eefe20578a50d6db8132f077
File details
Details for the file openinference_instrumentation_langchain-0.1.26-py3-none-any.whl.
File metadata
- Download URL: openinference_instrumentation_langchain-0.1.26-py3-none-any.whl
- Size: 16.7 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.1.1 CPython/3.12.4
File hashes
Algorithm | Hash digest
---|---
SHA256 | a40f71286a8bedf20253fce7c83a078f81b071ec0875ac1b8fcafa30dc038896
MD5 | 00cc13950811f5a650c8c2038e4110fc
BLAKE2b-256 | 13cb9bdd38286f2e81ebf590396a993ad87138ee597e116fa55f85290fe65dd2