# lightrace-python

Agentic development kit — LLM tracing, tool management, and agent primitives. A lightweight LLM tracing SDK for Python with remote tool invocation.
## Install

```shell
pip install lightrace
```
## Quick Start

```python
from lightrace import Lightrace, trace

lt = Lightrace(
    public_key="pk-lt-demo",
    secret_key="sk-lt-demo",
    host="http://localhost:3000",
)

# Root trace
@trace()
def run_agent(query: str):
    return search(query)

# Span
@trace(type="span")
def search(query: str) -> list:
    return ["result1", "result2"]

# Generation (LLM call)
@trace(type="generation", model="gpt-4o")
def generate(prompt: str) -> str:
    return "LLM response"

# Tool — remotely invocable from the Lightrace UI
@trace(type="tool")
def weather_lookup(city: str) -> dict:
    return {"temp": 72, "unit": "F"}

# Tool — traced but NOT remotely invocable
@trace(type="tool", invoke=False)
def read_file(path: str) -> str:
    return open(path).read()

run_agent("hello")
lt.flush()
lt.shutdown()
```
## @trace API

```python
@trace()                                   # Root trace
@trace(type="span")                        # Span observation
@trace(type="generation", model="gpt-4o")  # LLM generation
@trace(type="tool")                        # Tool (remotely invocable)
@trace(type="tool", invoke=False)          # Tool (trace only)
```
### Parameters

| Parameter | Type | Default | Description |
|---|---|---|---|
| `type` | `str` | `None` | One of `"span"`, `"generation"`, `"tool"`, `"chain"`, `"event"` |
| `name` | `str` | `None` | Override name (defaults to the function name) |
| `invoke` | `bool` | `True` | For `type="tool"`: register for remote invocation |
| `model` | `str` | `None` | For `type="generation"`: LLM model name |
| `metadata` | `dict` | `None` | Static metadata attached to every call |
## Lightrace() Constructor

| Parameter | Type | Default | Description |
|---|---|---|---|
| `tools` | `list` | `None` | LangChain tools or callables to register for dashboard re-invocation |
| `context` | `dict[str, (getter, setter)]` | `None` | Context variables for automatic capture/restore during fork |

```python
from lightrace import Lightrace

lt = Lightrace(
    public_key="pk-lt-demo",
    secret_key="sk-lt-demo",
    host="http://localhost:3000",
    tools=[get_weather, calculate],  # register tools in one step
    context={                        # register context vars in one step
        "user_id": (get_user_id, set_user_id),
        "session_id": (get_session, set_session),
    },
)
```
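Each `context` entry is a plain `(getter, setter)` pair, so any state mechanism works. A minimal sketch of one such pair backed by the standard library's `contextvars` (the variable and function names here are illustrative, not part of the Lightrace API):

```python
# Illustrative (getter, setter) pair backed by a ContextVar. Lightrace's
# context= parameter expects this shape so it can capture the value before
# a fork and restore it on replay.
import contextvars

user_id_var = contextvars.ContextVar("user_id", default=None)

def get_user_id():
    return user_id_var.get()

def set_user_id(value):
    user_id_var.set(value)

# The pair would then be registered like:
#   lt = Lightrace(..., context={"user_id": (get_user_id, set_user_id)})

set_user_id("u-123")
print(get_user_id())  # u-123
```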
## Integrations

### OpenAI

```python
import openai

from lightrace import Lightrace, trace
from lightrace.integrations.openai import LightraceOpenAIInstrumentor

lt = Lightrace(
    public_key="pk-lt-demo",
    secret_key="sk-lt-demo",
    host="http://localhost:3000",
)

client = openai.OpenAI()
instrumentor = LightraceOpenAIInstrumentor(client=lt)
instrumentor.instrument(client)

@trace()
def ask_gpt():
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        max_tokens=256,
        messages=[{"role": "user", "content": "What is the speed of light?"}],
    )
    return response.choices[0].message.content

ask_gpt()
lt.flush()
lt.shutdown()
```
### Anthropic

```python
import anthropic

from lightrace import Lightrace, trace
from lightrace.integrations.anthropic import LightraceAnthropicInstrumentor

lt = Lightrace(
    public_key="pk-lt-demo",
    secret_key="sk-lt-demo",
    host="http://localhost:3000",
)

client = anthropic.Anthropic()
instrumentor = LightraceAnthropicInstrumentor(client=lt)
instrumentor.instrument(client)

@trace()
def ask_claude():
    response = client.messages.create(
        model="claude-sonnet-4-20250514",
        max_tokens=256,
        messages=[{"role": "user", "content": "What is the capital of Mongolia?"}],
    )
    return response.content[0].text

ask_claude()
lt.flush()
lt.shutdown()
```
### LangChain

```python
from langchain_core.messages import HumanMessage
from langchain_openai import ChatOpenAI

from lightrace import Lightrace
from lightrace.integrations.langchain import LightraceCallbackHandler

lt = Lightrace(
    public_key="pk-lt-demo",
    secret_key="sk-lt-demo",
    host="http://localhost:3000",
)

handler = LightraceCallbackHandler(client=lt)

model = ChatOpenAI(model="gpt-4o-mini", max_tokens=256)
response = model.invoke(
    [HumanMessage(content="What is the speed of light?")],
    config={"callbacks": [handler]},
)

lt.flush()
lt.shutdown()
```
### LangGraph Fork / Replay

Fork lets you answer "what if this tool returned something different?" by forking a LangGraph execution from any tool checkpoint and continuing with modified output.

```python
import asyncio

from langchain_anthropic import ChatAnthropic
from langchain_core.tools import tool
from langgraph.checkpoint.memory import MemorySaver
from langgraph.prebuilt import create_react_agent

from lightrace import Lightrace
from lightrace.integrations.langchain import LightraceCallbackHandler

@tool
def get_weather(city: str) -> str:
    """Get the current weather for a city."""
    return "72F, sunny"

# Pass tools= to register them for dashboard re-invocation
lt = Lightrace(
    public_key="pk-lt-demo",
    secret_key="sk-lt-demo",
    host="http://localhost:3000",
    tools=[get_weather],
)

# Checkpointer is required for fork
agent = create_react_agent(
    ChatAnthropic(model="claude-sonnet-4-20250514"),
    [get_weather],
    checkpointer=MemorySaver(),
)

async def main():
    thread_id = "demo-thread"
    handler = LightraceCallbackHandler(
        client=lt,
        session_id=thread_id,
        trace_name="weather-agent",
        configurable={"thread_id": thread_id},
    )
    await agent.ainvoke(
        {"messages": [("user", "What's the weather in Tokyo?")]},
        config={"configurable": {"thread_id": thread_id}, "callbacks": [handler]},
    )
    # Register the graph for fork/replay from the dashboard
    lt.register_graph(agent, event_loop=asyncio.get_running_loop())
    lt.flush()

asyncio.run(main())
```
Requirements for fork:

- The graph must have a checkpointer (`MemorySaver`, `AsyncPostgresSaver`, etc.)
- Call `lt.register_graph(agent)` to enable fork from the dashboard
- Pass `tools=[...]` to the constructor (or call `lt.register_tools(...)`) so tools can be re-invoked
- Pass `session_id=thread_id` and `configurable={"thread_id": ...}` to the callback handler
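Conceptually, a fork restores the checkpoint saved before a tool call and continues the run with a substituted tool result. A toy sketch of that idea in plain Python (purely illustrative; this is not how Lightrace or LangGraph implement checkpointing):

```python
# Toy model of "fork from a tool checkpoint": snapshot the state before the
# tool runs, then resume a copy of that snapshot with a different output.
from copy import deepcopy

def run_agent(tool_fn, checkpoints):
    state = {"messages": ["user: What's the weather in Tokyo?"]}
    checkpoints.append(deepcopy(state))  # checkpoint before the tool call
    state["messages"].append(f"tool: {tool_fn('Tokyo')}")
    state["messages"].append("assistant: answered from the tool output")
    return state

checkpoints = []
original = run_agent(lambda city: "72F, sunny", checkpoints)

# Fork: restore the pre-tool checkpoint and continue with a modified result
forked = deepcopy(checkpoints[0])
forked["messages"].append("tool: 40F, raining")
forked["messages"].append("assistant: answered from the tool output")

print(original["messages"][1])  # tool: 72F, sunny
print(forked["messages"][1])    # tool: 40F, raining
```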
### Claude Agent SDK

```python
import anyio
from claude_agent_sdk import AssistantMessage, ClaudeAgentOptions, ResultMessage, TextBlock

from lightrace import Lightrace
from lightrace.integrations.claude_agent_sdk import traced_query

lt = Lightrace(
    public_key="pk-lt-demo",
    secret_key="sk-lt-demo",
    host="http://localhost:3000",
)

async def main():
    async for message in traced_query(
        prompt="What files are in the current directory?",
        options=ClaudeAgentOptions(max_turns=3),
        client=lt,
        trace_name="file-lister",
    ):
        if isinstance(message, AssistantMessage):
            for block in message.content:
                if isinstance(block, TextBlock):
                    print(block.text)
        elif isinstance(message, ResultMessage):
            print(f"Cost: ${message.total_cost_usd:.4f}")
    lt.flush()
    lt.shutdown()

anyio.run(main)
```
You can also use the handler directly for more control:

```python
from claude_agent_sdk import query

from lightrace.integrations.claude_agent_sdk import LightraceAgentHandler

handler = LightraceAgentHandler(prompt="Hello", client=lt, trace_name="my-agent")
async for message in query(prompt="Hello"):
    handler.handle(message)
```
## Compatibility

The Lightrace server also accepts traces from the Langfuse Python/JS SDKs.
## Related
- Lightrace — the main platform (backend + frontend)
- Lightrace CLI — self-host with a single command
- lightrace-js — TypeScript/JavaScript SDK
## Development

```shell
uv sync --extra dev
uv run pre-commit install
uv run pytest -s -v tests/
uv run ruff check .
uv run mypy src/lightrace
```
## License
MIT