
Agentic development kit: LLM tracing, tool management, and agent primitives

Project description

Lightrace

lightrace-python


Lightweight LLM tracing SDK for Python with remote tool invocation.


Install

pip install lightrace

Quick Start

from lightrace import Lightrace, trace

lt = Lightrace(
    public_key="pk-lt-demo",
    secret_key="sk-lt-demo",
    host="http://localhost:3000",
)

# Root trace
@trace()
def run_agent(query: str):
    return search(query)

# Span
@trace(type="span")
def search(query: str) -> list:
    return ["result1", "result2"]

# Generation (LLM call)
@trace(type="generation", model="gpt-4o")
def generate(prompt: str) -> str:
    return "LLM response"

# Tool — remotely invocable from the Lightrace UI
@trace(type="tool")
def weather_lookup(city: str) -> dict:
    return {"temp": 72, "unit": "F"}

# Tool — traced but NOT remotely invocable
@trace(type="tool", invoke=False)
def read_file(path: str) -> str:
    with open(path) as f:  # close the file handle when done
        return f.read()

run_agent("hello")
lt.flush()
lt.shutdown()
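To make the parent/child relationship in the example concrete, here is a minimal, self-contained sketch of how nested decorated calls can form a trace tree. This is an illustration only, not Lightrace's actual implementation; the `ROOTS` list and `_CURRENT` context variable are invented for the demo:

```python
import contextvars
import functools

_CURRENT = contextvars.ContextVar("current_node", default=None)
ROOTS = []  # completed root traces (demo only)

def trace(type="trace"):
    """Toy decorator: records each call as a node in a parent/child tree."""
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            node = {"name": fn.__name__, "type": type, "children": []}
            parent = _CURRENT.get()
            if parent is None:
                ROOTS.append(node)        # no active trace: this call is a root
            else:
                parent["children"].append(node)
            token = _CURRENT.set(node)    # calls made inside fn attach here
            try:
                return fn(*args, **kwargs)
            finally:
                _CURRENT.reset(token)
        return wrapper
    return decorator

@trace()
def run_agent(query):
    return search(query)

@trace(type="span")
def search(query):
    return ["result1"]

run_agent("hello")
print(ROOTS[0]["name"], "->", ROOTS[0]["children"][0]["name"])  # run_agent -> search
```

Because the "current node" lives in a `ContextVar`, the same pattern nests correctly across async tasks, which is one plausible reason SDKs in this space structure tracing this way.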

@trace API

@trace()                                    # Root trace
@trace(type="span")                         # Span observation
@trace(type="generation", model="gpt-4o")   # LLM generation
@trace(type="tool")                         # Tool (remotely invocable)
@trace(type="tool", invoke=False)           # Tool (trace only)

Parameters

Parameter  Type  Default  Description
type       str   None     Observation type: "span", "generation", "tool", "chain", or "event"
name       str   None     Override the observation name (defaults to the function name)
invoke     bool  True     For type="tool": register the function for remote invocation
model      str   None     For type="generation": LLM model name
metadata   dict  None     Static metadata attached to every call
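The table above can be read as a contract: `name` falls back to the function name, `metadata` rides along with every call, and `invoke=True` makes a tool reachable by name. A toy sketch of those semantics (again, not the library's source; `TOOL_REGISTRY` and `CALL_LOG` are invented for illustration):

```python
import functools

TOOL_REGISTRY = {}  # observation name -> wrapped callable (remote dispatch)
CALL_LOG = []       # one record appended per traced call

def trace(type=None, name=None, invoke=True, model=None, metadata=None):
    def decorator(fn):
        obs_name = name or fn.__name__          # `name` defaults to function name
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            CALL_LOG.append({
                "name": obs_name,
                "type": type or "trace",
                "model": model,                  # only meaningful for generations
                "metadata": metadata or {},      # static metadata on every call
            })
            return fn(*args, **kwargs)
        if type == "tool" and invoke:
            TOOL_REGISTRY[obs_name] = wrapper    # invoke=True: callable by name
        return wrapper
    return decorator

@trace(type="tool")
def weather_lookup(city):
    return {"temp": 72, "unit": "F"}

@trace(type="tool", invoke=False, name="fs_read", metadata={"team": "infra"})
def read_file(path):
    return "<contents>"

assert "weather_lookup" in TOOL_REGISTRY   # registered for remote invocation
assert "fs_read" not in TOOL_REGISTRY      # invoke=False: traced only
TOOL_REGISTRY["weather_lookup"]("Oslo")    # a server could dispatch like this
```

Registering the wrapper (rather than the raw function) means remotely invoked tool calls are traced the same way as local ones.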

Integrations

OpenAI

import openai
from lightrace import Lightrace, trace
from lightrace.integrations.openai import LightraceOpenAIInstrumentor

lt = Lightrace(
    public_key="pk-lt-demo",
    secret_key="sk-lt-demo",
    host="http://localhost:3000",
)

client = openai.OpenAI()
instrumentor = LightraceOpenAIInstrumentor(client=lt)
instrumentor.instrument(client)

@trace()
def ask_gpt():
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        max_tokens=256,
        messages=[{"role": "user", "content": "What is the speed of light?"}],
    )
    return response.choices[0].message.content

ask_gpt()
lt.flush()
lt.shutdown()

Anthropic

import anthropic
from lightrace import Lightrace, trace
from lightrace.integrations.anthropic import LightraceAnthropicInstrumentor

lt = Lightrace(
    public_key="pk-lt-demo",
    secret_key="sk-lt-demo",
    host="http://localhost:3000",
)

client = anthropic.Anthropic()
instrumentor = LightraceAnthropicInstrumentor(client=lt)
instrumentor.instrument(client)

@trace()
def ask_claude():
    response = client.messages.create(
        model="claude-sonnet-4-20250514",
        max_tokens=256,
        messages=[{"role": "user", "content": "What is the capital of Mongolia?"}],
    )
    return response.content[0].text

ask_claude()
lt.flush()
lt.shutdown()

LangChain

from langchain_core.messages import HumanMessage
from langchain_openai import ChatOpenAI
from lightrace import Lightrace
from lightrace.integrations.langchain import LightraceCallbackHandler

lt = Lightrace(
    public_key="pk-lt-demo",
    secret_key="sk-lt-demo",
    host="http://localhost:3000",
)

handler = LightraceCallbackHandler(client=lt)
model = ChatOpenAI(model="gpt-4o-mini", max_tokens=256)

response = model.invoke(
    [HumanMessage(content="What is the speed of light?")],
    config={"callbacks": [handler]},
)

lt.flush()
lt.shutdown()

Claude Agent SDK

import anyio
from claude_agent_sdk import AssistantMessage, ClaudeAgentOptions, ResultMessage, TextBlock
from lightrace import Lightrace
from lightrace.integrations.claude_agent_sdk import traced_query

lt = Lightrace(
    public_key="pk-lt-demo",
    secret_key="sk-lt-demo",
    host="http://localhost:3000",
)

async def main():
    async for message in traced_query(
        prompt="What files are in the current directory?",
        options=ClaudeAgentOptions(max_turns=3),
        client=lt,
        trace_name="file-lister",
    ):
        if isinstance(message, AssistantMessage):
            for block in message.content:
                if isinstance(block, TextBlock):
                    print(block.text)
        elif isinstance(message, ResultMessage):
            print(f"Cost: ${message.total_cost_usd:.4f}")

    lt.flush()
    lt.shutdown()

anyio.run(main)

You can also use the handler directly for more control:

from claude_agent_sdk import query
from lightrace.integrations.claude_agent_sdk import LightraceAgentHandler

handler = LightraceAgentHandler(prompt="Hello", client=lt, trace_name="my-agent")

async for message in query(prompt="Hello"):
    handler.handle(message)

Compatibility

The Lightrace server also accepts traces from the Langfuse Python and JS SDKs.
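If that compatibility extends to configuration, an existing Langfuse-instrumented app could be pointed at a Lightrace server through the standard Langfuse SDK environment variables. The variable names below are Langfuse's; reusing Lightrace keys in them is an assumption based on the compatibility note, not something this page confirms:

```shell
# Point an existing Langfuse-instrumented app at a local Lightrace server.
export LANGFUSE_HOST="http://localhost:3000"
export LANGFUSE_PUBLIC_KEY="pk-lt-demo"
export LANGFUSE_SECRET_KEY="sk-lt-demo"
```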

Development

uv sync --extra dev
uv run pre-commit install
uv run pytest -s -v tests/
uv run ruff check .
uv run mypy src/lightrace

License

MIT

Download files

Download the file for your platform.

Source Distribution

lightrace-1.0.12.tar.gz (400.3 kB)

Uploaded Source

Built Distribution

lightrace-1.0.12-py3-none-any.whl (48.7 kB)

Uploaded Python 3

File details

Details for the file lightrace-1.0.12.tar.gz.

File metadata

  • Download URL: lightrace-1.0.12.tar.gz
  • Size: 400.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for lightrace-1.0.12.tar.gz
Algorithm Hash digest
SHA256 4ce7c62cac475aaea33f7989240c735b5257c77d5bd9e74e0d52804176686ff4
MD5 99745fcb600d514835af9c8b2a5ac704
BLAKE2b-256 4c207b6927001ff3ea9cf9f839a038348704208f9e70d60dcf7491d33231bebc


Provenance

The following attestation bundles were made for lightrace-1.0.12.tar.gz:

Publisher: release.yml on SKE-Labs/lightrace-python

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file lightrace-1.0.12-py3-none-any.whl.

File metadata

  • Download URL: lightrace-1.0.12-py3-none-any.whl
  • Size: 48.7 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for lightrace-1.0.12-py3-none-any.whl
Algorithm Hash digest
SHA256 8c37ff9d8b0295b787ccf96800d19536878424ec7c161d88781f76d12a5a585c
MD5 e1447c61901f98759bf1a1d806a31edd
BLAKE2b-256 b43c2a9cea562cca1d7cd05b7da8c1c747dedf2b6fc14d7fd7ea400cc0b348cc


Provenance

The following attestation bundles were made for lightrace-1.0.12-py3-none-any.whl:

Publisher: release.yml on SKE-Labs/lightrace-python

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
