
OpenTelemetry instrumentation for LangGraph.

Project description

LLM Tracekit - LangGraph

OpenTelemetry instrumentation for LangGraph, focused on span structure and node attributes for graph runs. Use it together with LangChain, OpenAI, or other LLM instrumentors for full observability.

Span structure (3 levels)

  1. Global span — One per graph invocation. Starts when execution leaves START and ends when it reaches END. Span name: "LangGraph".
  2. Node spans — One per graph node execution, as children of the global span. Span name: "LangGraph Node <node_name>". Each node span has two attributes: node name (gen_ai.langgraph.node) and step number (gen_ai.langgraph.step, when provided by LangGraph). The node span is the current span while the node runs, so any LLM calls inside the node are traced by other instrumentors as children of that node span. Tool nodes (nodes that only run tools and do not call an LLM) get a node span too; they have no LLM child spans.
  3. LLM spans — Created by other instrumentors (LangChain, OpenAI, Gemini, etc.) when a node calls an LLM. They appear as children of the corresponding node span.

Resulting trace: LangGraph → LangGraph Node … → chat/completion (from LangChain/OpenAI/etc.) where the node runs an LLM; tool-only nodes appear as LangGraph Node <name> with no child.
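
For illustration only (the node names "agent" and "tools" and the step values below are hypothetical; the span names and attribute keys follow the description above), an exported trace for a two-node graph where "agent" calls an LLM and "tools" only runs tools might look like:

# Hypothetical example trace, expressed as exported span data.
expected_spans = [
    {"name": "LangGraph", "parent": None},
    {"name": "LangGraph Node agent", "parent": "LangGraph",
     "attributes": {"gen_ai.langgraph.node": "agent", "gen_ai.langgraph.step": 1}},
    # Created by the LangChain/OpenAI instrumentor, not by this package:
    {"name": "chat/completion", "parent": "LangGraph Node agent"},
    # Tool-only node: gets a node span but no LLM child span.
    {"name": "LangGraph Node tools", "parent": "LangGraph",
     "attributes": {"gen_ai.langgraph.node": "tools", "gen_ai.langgraph.step": 2}},
]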

Installation

pip install "llm-tracekit-langgraph"

Usage

Setting up tracing

You can use the setup_export_to_coralogix function to set up tracing and export traces to Coralogix:

from llm_tracekit.langgraph import setup_export_to_coralogix

setup_export_to_coralogix(
    service_name="ai-service",
    application_name="ai-application",
    subsystem_name="ai-subsystem",
)

Alternatively, set up tracing manually with your preferred TracerProvider and exporter.
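
For example, a minimal manual setup with the standard OpenTelemetry SDK might look like this (the OTLP exporter and endpoint are placeholder choices, not requirements of this package):

from opentelemetry import trace
from opentelemetry.sdk.resources import Resource
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.grpc.trace_exporter import OTLPSpanExporter

# Create a provider, attach an exporter, and register it globally so
# instrumentors pick it up.
exporter = OTLPSpanExporter(endpoint="http://localhost:4317")  # placeholder endpoint
provider = TracerProvider(resource=Resource.create({"service.name": "ai-service"}))
provider.add_span_processor(BatchSpanProcessor(exporter))
trace.set_tracer_provider(provider)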

Activation

To instrument all LangGraph runs that use LangChain's callback manager:

from llm_tracekit.langgraph import LangGraphInstrumentor

LangGraphInstrumentor().instrument()

Capturing LLM call spans

This instrumentor only creates the graph-level and node-level spans above. It does not create spans for LLM calls. To get LLM spans (model, token usage, tool calls, etc.) as children of the node span that runs the LLM:

  • Use LangChain: install llm-tracekit-langchain and call LangChainInstrumentor().instrument(...) in addition to LangGraphInstrumentor().instrument(...), as sketched below. Both can run together; LangChain creates child spans under the current (node) span.
  • Or use provider-specific instrumentors (OpenAI, Bedrock, etc.) instead of or alongside LangChain.

Install and activate the extra instrumentor(s) you need. The same tracer provider can be passed to all of them. LLM spans will appear under the correct node span because the node span is set as the current span while the node runs.
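
A minimal sketch of activating both instrumentors (the llm_tracekit.langchain import path is an assumption based on the package name; check the llm-tracekit-langchain docs for the exact path):

from llm_tracekit.langgraph import LangGraphInstrumentor
from llm_tracekit.langchain import LangChainInstrumentor  # assumed import path

# Graph and node spans come from LangGraph; LLM spans from LangChain nest
# under the node spans. Per the note above, the same tracer provider can be
# passed to both (typically via an explicit tracer_provider keyword).
LangGraphInstrumentor().instrument()
LangChainInstrumentor().instrument()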

Uninstrument

LangGraphInstrumentor().uninstrument()

Full example

Minimal graph (no LLM):

from typing import TypedDict

from langgraph.graph import StateGraph, START, END
from langgraph.checkpoint.memory import MemorySaver

from llm_tracekit.langgraph import LangGraphInstrumentor, setup_export_to_coralogix

setup_export_to_coralogix(service_name="ai-service")

LangGraphInstrumentor().instrument()

class State(TypedDict):
    messages: list

def node_a(state: State) -> dict:
    return {"messages": state.get("messages", []) + ["A"]}

def node_b(state: State) -> dict:
    return {"messages": state.get("messages", []) + ["B"]}

graph = StateGraph(State)
graph.add_node("a", node_a)
graph.add_node("b", node_b)
graph.add_edge(START, "a")
graph.add_edge("a", "b")
graph.add_edge("b", END)

app = graph.compile(checkpointer=MemorySaver())
result = app.invoke({"messages": []}, config={"configurable": {"thread_id": "1"}})
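
As a follow-up sketch with an LLM in the loop (assuming langchain_openai is installed, an OpenAI API key is configured, and an LLM instrumentor such as llm-tracekit-langchain is also active; the model name and node name are placeholders), the chat/completion span for the call below would appear under the "LangGraph Node ask_llm" span:

from typing import TypedDict

from langchain_openai import ChatOpenAI
from langgraph.graph import StateGraph, START, END

llm = ChatOpenAI(model="gpt-4o-mini")  # placeholder model

class QAState(TypedDict):
    question: str
    answer: str

def ask_llm(state: QAState) -> dict:
    # The LLM call here is traced by the LangChain/OpenAI instrumentor and
    # shows up as a child of the "LangGraph Node ask_llm" span.
    response = llm.invoke(state["question"])
    return {"answer": response.content}

graph = StateGraph(QAState)
graph.add_node("ask_llm", ask_llm)
graph.add_edge(START, "ask_llm")
graph.add_edge("ask_llm", END)

app = graph.compile()
result = app.invoke({"question": "What does OpenTelemetry do?"})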

Manual handler

You can also add the handler explicitly when invoking a graph (e.g. for testing or when not using the instrumentor):

from opentelemetry import trace

from llm_tracekit.langgraph.callback import LangGraphCallbackHandler

# Obtain a tracer from the previously configured (global) tracer provider.
tracer = trace.get_tracer_provider().get_tracer(__name__)
handler = LangGraphCallbackHandler(tracer=tracer)
result = app.invoke(initial_state, config={"callbacks": [handler], "configurable": {"thread_id": "1"}})

