# lightrace-python

Agentic development kit: LLM tracing, tool management, and agent primitives.

Lightweight LLM tracing SDK for Python with remote tool invocation.
## Install

```bash
pip install lightrace
```
## Quick Start

```python
from lightrace import Lightrace, trace

lt = Lightrace(
    public_key="pk-lt-demo",
    secret_key="sk-lt-demo",
    host="http://localhost:3000",
)

# Root trace
@trace()
def run_agent(query: str):
    return search(query)

# Span
@trace(type="span")
def search(query: str) -> list:
    return ["result1", "result2"]

# Generation (LLM call)
@trace(type="generation", model="gpt-4o")
def generate(prompt: str) -> str:
    return "LLM response"

# Tool — remotely invocable from the Lightrace UI
@trace(type="tool")
def weather_lookup(city: str) -> dict:
    return {"temp": 72, "unit": "F"}

# Tool — traced but NOT remotely invocable
@trace(type="tool", invoke=False)
def read_file(path: str) -> str:
    return open(path).read()

run_agent("hello")
lt.flush()
lt.shutdown()
```
## `@trace` API

```python
@trace()                                   # Root trace
@trace(type="span")                        # Span observation
@trace(type="generation", model="gpt-4o")  # LLM generation
@trace(type="tool")                        # Tool (remotely invocable)
@trace(type="tool", invoke=False)          # Tool (trace only)
```
### Parameters

| Parameter | Type | Default | Description |
|---|---|---|---|
| `type` | `str` | `None` | `"span"`, `"generation"`, `"tool"`, `"chain"`, `"event"` |
| `name` | `str` | `None` | Override name (defaults to the function name) |
| `invoke` | `bool` | `True` | For `type="tool"`: register for remote invocation |
| `model` | `str` | `None` | For `type="generation"`: LLM model name |
| `metadata` | `dict` | `None` | Static metadata attached to every call |
## Integrations

### OpenAI

```python
import openai

from lightrace import Lightrace, trace
from lightrace.integrations.openai import LightraceOpenAIInstrumentor

lt = Lightrace(
    public_key="pk-lt-demo",
    secret_key="sk-lt-demo",
    host="http://localhost:3000",
)

client = openai.OpenAI()
instrumentor = LightraceOpenAIInstrumentor(client=lt)
instrumentor.instrument(client)

@trace()
def ask_gpt():
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        max_tokens=256,
        messages=[{"role": "user", "content": "What is the speed of light?"}],
    )
    return response.choices[0].message.content

ask_gpt()
lt.flush()
lt.shutdown()
```
### Anthropic

```python
import anthropic

from lightrace import Lightrace, trace
from lightrace.integrations.anthropic import LightraceAnthropicInstrumentor

lt = Lightrace(
    public_key="pk-lt-demo",
    secret_key="sk-lt-demo",
    host="http://localhost:3000",
)

client = anthropic.Anthropic()
instrumentor = LightraceAnthropicInstrumentor(client=lt)
instrumentor.instrument(client)

@trace()
def ask_claude():
    response = client.messages.create(
        model="claude-sonnet-4-20250514",
        max_tokens=256,
        messages=[{"role": "user", "content": "What is the capital of Mongolia?"}],
    )
    return response.content[0].text

ask_claude()
lt.flush()
lt.shutdown()
```
### LangChain

```python
from langchain_core.messages import HumanMessage
from langchain_openai import ChatOpenAI

from lightrace import Lightrace
from lightrace.integrations.langchain import LightraceCallbackHandler

lt = Lightrace(
    public_key="pk-lt-demo",
    secret_key="sk-lt-demo",
    host="http://localhost:3000",
)

handler = LightraceCallbackHandler(client=lt)
model = ChatOpenAI(model="gpt-4o-mini", max_tokens=256)

response = model.invoke(
    [HumanMessage(content="What is the speed of light?")],
    config={"callbacks": [handler]},
)

lt.flush()
lt.shutdown()
```
### Claude Agent SDK

```python
import anyio
from claude_agent_sdk import AssistantMessage, ClaudeAgentOptions, ResultMessage, TextBlock

from lightrace import Lightrace
from lightrace.integrations.claude_agent_sdk import traced_query

lt = Lightrace(
    public_key="pk-lt-demo",
    secret_key="sk-lt-demo",
    host="http://localhost:3000",
)

async def main():
    async for message in traced_query(
        prompt="What files are in the current directory?",
        options=ClaudeAgentOptions(max_turns=3),
        client=lt,
        trace_name="file-lister",
    ):
        if isinstance(message, AssistantMessage):
            for block in message.content:
                if isinstance(block, TextBlock):
                    print(block.text)
        elif isinstance(message, ResultMessage):
            print(f"Cost: ${message.total_cost_usd:.4f}")
    lt.flush()
    lt.shutdown()

anyio.run(main)
```
You can also use the handler directly for more control:

```python
from claude_agent_sdk import query

from lightrace.integrations.claude_agent_sdk import LightraceAgentHandler

handler = LightraceAgentHandler(prompt="Hello", client=lt, trace_name="my-agent")

# Inside an async function:
async for message in query(prompt="Hello"):
    handler.handle(message)
```
## Compatibility

The Lightrace server also accepts traces from the Langfuse Python and JS SDKs.
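Since the compatibility is on the server side, an existing Langfuse client can in principle simply be pointed at the Lightrace host. A minimal sketch, assuming the Langfuse Python SDK is installed and a Lightrace server is running locally on port 3000:

```python
from langfuse import Langfuse

# Reuse the Lightrace project keys; only the host differs
# from a standard cloud Langfuse configuration.
langfuse = Langfuse(
    public_key="pk-lt-demo",
    secret_key="sk-lt-demo",
    host="http://localhost:3000",
)
```

Whether every Langfuse feature maps onto Lightrace is not stated in this README; treat this as a configuration sketch rather than a guarantee.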
## Related
- Lightrace — the main platform (backend + frontend)
- Lightrace CLI — self-host with a single command
- lightrace-js — TypeScript/JavaScript SDK
## Development

```bash
uv sync --extra dev
uv run pre-commit install
uv run pytest -s -v tests/
uv run ruff check .
uv run mypy src/lightrace
```
## License

MIT