LangChain & LangGraph OpenTelemetry Integration
Overview
This integration provides comprehensive OpenTelemetry instrumentation for both LangChain and LangGraph frameworks. It enables detailed tracing and monitoring of applications built with these frameworks.
Installation
Install traceAI LangChain:

```shell
pip install traceAI-langchain
```

For LangGraph support (optional):

```shell
pip install "traceAI-langchain[langgraph]"
```

Install LangChain OpenAI:

```shell
pip install langchain-openai
```
Environment Variables
Set up your environment variables to authenticate with FutureAGI (replace the placeholder strings with your actual keys):

```python
import os

os.environ["FI_API_KEY"] = "your-fi-api-key"
os.environ["FI_SECRET_KEY"] = "your-fi-secret-key"
os.environ["OPENAI_API_KEY"] = "your-openai-api-key"
```
LangChain Quickstart
Register Tracer Provider
Set up the trace provider to establish the observability pipeline:

```python
from fi_instrumentation import register
from fi_instrumentation.fi_types import ProjectType

trace_provider = register(
    project_type=ProjectType.OBSERVE,
    project_name="langchain_app",
    session_name="chat-bot",
)
```
Configure LangChain Instrumentation
Instrument the LangChain client to enable telemetry collection:

```python
from traceai_langchain import LangChainInstrumentor

LangChainInstrumentor().instrument(tracer_provider=trace_provider)
```
Create LangChain Components
Set up your LangChain components with built-in observability:

```python
from langchain_core.prompts import ChatPromptTemplate
from langchain_openai import ChatOpenAI

prompt = ChatPromptTemplate.from_template("{x} {y} {z}?").partial(x="why is", z="blue")
chain = prompt | ChatOpenAI(model="gpt-3.5-turbo")

def run_chain():
    try:
        result = chain.invoke({"y": "sky"})
        print(f"Response: {result}")
    except Exception as e:
        print(f"Error executing chain: {e}")

if __name__ == "__main__":
    run_chain()
```
LangGraph Instrumentation
LangGraph instrumentation provides comprehensive tracing for graph-based workflows, including:
- Graph Topology Capture: Automatically captures graph structure (nodes, edges, entry points)
- Node Execution Tracing: Tracks each node execution with state transitions
- Conditional Edge Decisions: Records branch decisions at routing nodes
- State Transition Tracking: Tracks state changes through graph execution with diffs
- Performance Metrics: Duration, memory usage, and execution counts
- Memory Tracking: Detects potential memory leaks in state growth
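As a rough illustration of the topology capture above, the captured structure can be thought of as a plain record of nodes, edges, and the entry point. The `GraphTopology` class below is a hypothetical stand-in (not the library's internal type); its fields mirror the `langgraph.graph.*` span attributes described later.

```python
from dataclasses import dataclass, field

# Hypothetical record of a captured graph topology (illustrative only).
@dataclass
class GraphTopology:
    nodes: list = field(default_factory=list)
    edges: list = field(default_factory=list)   # (source, target) pairs
    entry_point: str = ""

    @property
    def node_count(self) -> int:
        return len(self.nodes)

    @property
    def edge_count(self) -> int:
        return len(self.edges)

# The two-node graph from the quickstart below would capture as:
topology = GraphTopology(
    nodes=["process", "analyze"],
    edges=[("process", "analyze"), ("analyze", "__end__")],
    entry_point="process",
)
print(topology.node_count, topology.edge_count)  # 2 2
```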
LangGraph Quickstart
```python
from typing import TypedDict

from fi_instrumentation import register
from fi_instrumentation.fi_types import ProjectType
from langgraph.graph import StateGraph, END
from traceai_langchain import LangChainInstrumentor, LangGraphInstrumentor

# Define state schema
class MyState(TypedDict):
    messages: list
    current_step: str

# Example node functions (any callables that accept and return state)
def process_node(state: MyState) -> MyState:
    return {"messages": state["messages"] + ["processed"], "current_step": "process"}

def analyze_node(state: MyState) -> MyState:
    return {"messages": state["messages"] + ["analyzed"], "current_step": "analyze"}

# Register trace provider
trace_provider = register(
    project_type=ProjectType.OBSERVE,
    project_name="langgraph_app",
    session_name="workflow",
)

# Instrument both LangChain and LangGraph
LangChainInstrumentor().instrument(tracer_provider=trace_provider)
LangGraphInstrumentor().instrument(tracer_provider=trace_provider)

# Build your graph
workflow = StateGraph(MyState)
workflow.add_node("process", process_node)
workflow.add_node("analyze", analyze_node)
workflow.add_edge("process", "analyze")
workflow.add_edge("analyze", END)
workflow.set_entry_point("process")

# Compile and run - traces are automatically captured
app = workflow.compile()
result = app.invoke({"messages": [], "current_step": "start"})
```
LangGraph Span Attributes
The instrumentation captures rich attributes for debugging and monitoring:
Graph Structure
- `langgraph.graph.name` - Graph name
- `langgraph.graph.node_count` - Number of nodes
- `langgraph.graph.edge_count` - Number of edges
- `langgraph.graph.topology` - Full topology as JSON
- `langgraph.graph.entry_point` - Entry point node
Node Execution
- `langgraph.node.name` - Node name
- `langgraph.node.type` - Node type (start/end/intermediate)
- `langgraph.node.is_entry` - Whether this is the entry node
- `langgraph.node.is_end` - Whether this is an end node
State Management
- `langgraph.state.input` - Input state (JSON)
- `langgraph.state.output` - Output state (JSON)
- `langgraph.state.updates` - State updates from node
- `langgraph.state.changed_fields` - List of changed fields
- `langgraph.state.diff` - State diff (JSON)
Conditional Edges
- `langgraph.conditional.source` - Source node
- `langgraph.conditional.result` - Selected branch
- `langgraph.conditional.available_branches` - All available branches
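To make the conditional-edge attributes concrete, here is a pure-Python sketch of a routing decision and the kind of attribute payload it would produce. The `route` function and the literal attribute dict are illustrative; the instrumentation builds these attributes automatically when a graph's router runs.

```python
# Hypothetical router: picks a branch name based on the current state.
def route(state: dict) -> str:
    return "analyze" if state.get("messages") else "fallback"

available_branches = ["analyze", "fallback"]
chosen = route({"messages": ["hi"]})

# Illustrative attribute payload matching the keys listed above
attributes = {
    "langgraph.conditional.source": "process",
    "langgraph.conditional.result": chosen,
    "langgraph.conditional.available_branches": available_branches,
}
print(attributes["langgraph.conditional.result"])  # analyze
```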
Performance
- `langgraph.perf.duration_ms` - Execution duration
- `langgraph.perf.node` - Node name for performance metric
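The duration metric amounts to timing each node call. The `timed_node` wrapper below is a hypothetical sketch of where that number comes from; the instrumentation records it automatically around every node execution.

```python
import time

# Hypothetical wrapper illustrating how a node duration metric is derived.
def timed_node(node, state):
    start = time.perf_counter()
    result = node(state)
    duration_ms = (time.perf_counter() - start) * 1000.0
    perf = {
        "langgraph.perf.duration_ms": duration_ms,
        "langgraph.perf.node": getattr(node, "__name__", "node"),
    }
    return result, perf

result, perf = timed_node(lambda s: {**s, "done": True}, {"done": False})
```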
Memory Tracking
- `langgraph.memory.state_size_bytes` - Current state size
- `langgraph.memory.peak_bytes` - Peak memory usage
- `langgraph.memory.growth_warning` - Memory growth alert
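A rough sketch of how state-size accounting can work, assuming JSON-serializable state: measure each state's serialized size, track the peak, and flag unusual growth. The threshold and measurement method here are illustrative assumptions, not the library's actual implementation.

```python
import json

# Illustrative state-size measurement (assumes JSON-serializable state).
def state_size_bytes(state: dict) -> int:
    return len(json.dumps(state).encode("utf-8"))

peak = 0
growth_warning = False
for state in [{"messages": []}, {"messages": ["a" * 100]}]:
    size = state_size_bytes(state)
    if peak and size > peak * 2:  # arbitrary growth threshold for the sketch
        growth_warning = True
    peak = max(peak, size)

print(peak, growth_warning)
```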
Advanced: Accessing State and Topology
```python
from traceai_langchain import LangGraphInstrumentor

# Get the instrumentor instance
instrumentor = LangGraphInstrumentor()

# After graph execution, access captured data
topology = instrumentor.get_topology()
if topology:
    print(f"Nodes: {topology.nodes}")
    print(f"Edges: {topology.edges}")
    print(f"Entry point: {topology.entry_point}")

# Get state transition history
history = instrumentor.get_state_history()
for transition in history:
    print(f"Node: {transition['node']}")
    print(f"Diff: {transition['diff']}")

# Get memory statistics
memory_stats = instrumentor.get_memory_stats()
print(f"Peak memory: {memory_stats.get('peak_bytes', 0)} bytes")
```
Examples
LangChain Examples
- `examples/chat_prompt_template.py` - Basic chat prompt usage
- `examples/rag.py` - Retrieval-augmented generation
- `examples/tool_calling_agent.py` - Agent with tools
- `examples/openai_chat_stream.py` - Streaming responses
LangGraph Examples
- `examples/langgraph_simple_workflow.py` - Simple state machine workflow
- `examples/langgraph_agent_supervisor.py` - Multi-agent supervisor pattern
API Reference
LangChainInstrumentor
```python
from traceai_langchain import LangChainInstrumentor

# Initialize and instrument
instrumentor = LangChainInstrumentor()
instrumentor.instrument(tracer_provider=trace_provider)

# Get the span for a given LangChain run ID
span = instrumentor.get_span(run_id)

# Get ancestor spans for that run ID
ancestors = instrumentor.get_ancestors(run_id)
```
LangGraphInstrumentor
```python
from traceai_langchain import LangGraphInstrumentor

# Initialize and instrument
instrumentor = LangGraphInstrumentor()
instrumentor.instrument(
    tracer_provider=trace_provider,
    enable_memory_tracking=True,  # default: True
    max_state_history=100,        # default: 100
)

# Check instrumentation status
is_active = instrumentor.is_instrumented

# Get captured topology
topology = instrumentor.get_topology()

# Get state history
history = instrumentor.get_state_history()

# Get memory statistics
stats = instrumentor.get_memory_stats()

# Uninstrument when done
instrumentor.uninstrument()
```
LangGraphAttributes
All span attributes are defined as constants:
```python
from traceai_langchain import LangGraphAttributes

# Use in custom spans or queries
attrs = {
    LangGraphAttributes.NODE_NAME: "my_node",
    LangGraphAttributes.EXECUTION_MODE: "invoke",
}
```
Troubleshooting
LangGraph not being traced
- Ensure you've installed the langgraph extra: `pip install "traceAI-langchain[langgraph]"`
- Initialize `LangGraphInstrumentor` before creating any graphs
- Check that `langgraph` is installed: `pip show langgraph`
State diffs not appearing
State diffs only appear when there are actual changes between before and after states. If a node returns the same state, no diff will be recorded.
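As a rough sketch of this behavior (not the library's actual diffing code), a state diff can be computed by comparing fields before and after a node runs; identical states produce an empty diff, so nothing is recorded.

```python
# Illustrative diff helper, assuming flat dict states.
def state_diff(before: dict, after: dict) -> dict:
    return {
        key: {"before": before.get(key), "after": after[key]}
        for key in after
        if after[key] != before.get(key)
    }

changed = state_diff({"step": "start", "n": 1}, {"step": "process", "n": 1})
unchanged = state_diff({"step": "start"}, {"step": "start"})
print(changed)    # {'step': {'before': 'start', 'after': 'process'}}
print(unchanged)  # {} - identical states yield no diff
```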
Memory tracking overhead
If memory tracking causes performance issues, disable it:

```python
LangGraphInstrumentor().instrument(enable_memory_tracking=False)
```