
Lightweight distributed LLM execution tracer and profiler


TraceLM

TraceLM is a lightweight execution tracer and profiler for LLM pipelines, RAG systems, and FastAPI-based AI services.

It focuses on the tracing primitives that matter when you want to understand execution flow clearly: span trees, traceparent propagation, sampling, profiling, and local inspection.

Why TraceLM

TraceLM exists for developers who want observability without pulling in a full telemetry stack on day one.

  • Trace multi-step LLM workflows with explicit spans
  • Validate trace structure before profiling
  • Continue traces across service boundaries with W3C traceparent
  • Inspect latency, critical path, token usage, and cost from the CLI
  • Export traces to Chrome Trace or OpenTelemetry-compatible JSON

It is intentionally small, local-first, and easy to read.

Installation

pip install tracelm

Try it immediately:

tracelm demo
tracelm init

Optional integrations:

pip install "tracelm[fastapi]"
pip install "tracelm[requests]"
pip install "tracelm[otel]"

Quick Start

TraceLM terminal demo

The fastest path is the built-in demo:

tracelm demo
tracelm latest
tracelm export latest --format chrome

That gives you a real trace, a readable summary, and an exportable file without writing any code.

If you want a starter file generated for you:

tracelm init
tracelm run tracelm_example.py
tracelm latest

tracelm run executes scripts with an if __name__ == "__main__": guard the same way python script.py does, so generated starter files and standard Python entrypoints behave the way users expect.
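TraceLM's runner internals aren't shown on this page; conceptually, a tracer that wants main guards to fire can lean on the stdlib's runpy, which executes a file with __name__ set to any name you choose. A minimal sketch of that mechanism (not TraceLM's actual implementation):

```python
import os
import runpy
import tempfile

# A throwaway script whose main guard must fire when run under a tracer.
script = """
result = []

def main():
    result.append("ran")

if __name__ == "__main__":
    main()
"""

with tempfile.NamedTemporaryFile("w", suffix=".py", delete=False) as f:
    f.write(script)
    path = f.name

# run_path executes the file with __name__ bound to run_name, so the
# guard behaves exactly as it does under `python script.py`.
globals_after = runpy.run_path(path, run_name="__main__")
os.unlink(path)
print(globals_after["result"])
```

Running the same file with the default run_name would leave result empty, which is why a tracing runner must set run_name="__main__" explicitly.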

If you want to trace your own script, create a small instrumented file that uses the node decorator:

from tracelm.decorator import node
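Only the import is shown above. To illustrate the general shape of a span decorator like node, here is a stdlib-only stand-in that times each decorated call and records a span; this is an illustration of the pattern, not TraceLM's actual node implementation:

```python
import functools
import time

# Stand-in trace store: (name, duration_ms) pairs.
SPANS = []

def node(fn):
    """Stand-in span decorator: times each call and records a span."""
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        try:
            return fn(*args, **kwargs)
        finally:
            SPANS.append((fn.__name__, (time.perf_counter() - start) * 1000))
    return wrapper

@node
def step1():
    return "a"

@node
def step2():
    # step1 finishes (and records its span) before step2's span closes.
    return step1() + "b"

step2()
print([name for name, _ in SPANS])  # ['step1', 'step2']
```

The nesting here (step2 calling step1) is what produces the parent/child structure shown in the Execution Tree output below.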

Then run it:

tracelm run test_app.py
tracelm latest

Typical output:

Trace Summary
-------------
Trace ID: 55df12035a754aa080875618bc5794c3
Total Latency: 0.204
Total Spans: 3
Slowest Span: step2
Critical Path: __root__ -> step2
Tokens In: 0
Tokens Out: 0
Total Cost: 0
Anomalies: {'latency_spikes': ['step2']}

Duration Histogram (ms)
-----------------------
0.000-0.100: 1
0.100-0.500: 1
...

Execution Tree
--------------
__root__ (0.204 ms)
+-- step1 (0.082 ms)
\-- step2 (0.101 ms)
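The Critical Path line in the summary is the chain of slowest children walked down from the root. A short sketch of that computation over an assumed parent-to-children mapping (not TraceLM's internal data model):

```python
def critical_path(tree, durations, root="__root__"):
    """Walk from the root, always descending into the slowest child."""
    path = [root]
    node = root
    while tree.get(node):  # stop at a leaf (no children)
        node = max(tree[node], key=durations.__getitem__)
        path.append(node)
    return path

tree = {"__root__": ["step1", "step2"]}                          # parent -> children
durations = {"__root__": 0.204, "step1": 0.082, "step2": 0.101}  # ms

print(" -> ".join(critical_path(tree, durations)))  # __root__ -> step2
```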

Features

  • Hierarchical span tracing with a synthetic root span model
  • W3C traceparent parsing and continuation
  • FastAPI middleware integration
  • Requests-based outbound propagation
  • Head-based probabilistic sampling
  • Critical path and slowest-span analysis
  • Duration histogram generation
  • Token and cost aggregation
  • Trace comparison from the CLI
  • Chrome Trace export
  • OpenTelemetry JSON export and SDK bridge
  • SQLite-backed local trace storage
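The traceparent continuation listed above hinges on the W3C header format, version-trace_id-parent_id-flags, with all-zero IDs treated as invalid per the spec. A minimal spec-shaped parser sketch (not TraceLM's actual parsing code):

```python
import re

# W3C Trace Context: 2-hex version, 32-hex trace ID, 16-hex parent
# span ID, 2-hex flags, joined by dashes.
TRACEPARENT = re.compile(
    r"^(?P<version>[0-9a-f]{2})-"
    r"(?P<trace_id>[0-9a-f]{32})-"
    r"(?P<parent_id>[0-9a-f]{16})-"
    r"(?P<flags>[0-9a-f]{2})$"
)

def parse_traceparent(header):
    """Return the header's fields, or None if it is malformed.

    Per the spec, an all-zero trace_id or parent_id is invalid.
    """
    m = TRACEPARENT.match(header)
    if not m:
        return None
    fields = m.groupdict()
    if fields["trace_id"] == "0" * 32 or fields["parent_id"] == "0" * 16:
        return None
    return fields

ok = parse_traceparent(
    "00-55df12035a754aa080875618bc5794c3-00f067aa0ba902b7-01"
)
print(ok["trace_id"])                     # 55df12035a754aa080875618bc5794c3
print(parse_traceparent("not-a-header"))  # None
```

A service that receives a valid header continues the trace under the same trace_id, with the sender's parent_id as the entry span's parent.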

FastAPI Integration

from fastapi import FastAPI
from tracelm.integrations.fastapi import TraceLMMiddleware

app = FastAPI()
app.add_middleware(TraceLMMiddleware, sample_rate=1.0)


@app.get("/")
def compute():
    return {"status": "ok"}

Import the requests integration to propagate trace context on outbound HTTP calls:

import tracelm.integrations.requests

CLI Commands

Run a Python file under tracing:

tracelm run test_app.py

Apply head sampling:

tracelm run test_app.py --sample-rate 0.1
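Head sampling makes one keep/drop decision per trace, before any spans are recorded. One common deterministic scheme hashes the trace ID against the rate, so every service reaches the same decision without coordination; a sketch of that scheme (not necessarily how TraceLM decides):

```python
import hashlib

def head_sampled(trace_id, sample_rate):
    """Deterministic head sampling: the same trace_id always yields
    the same keep/drop decision for a given rate."""
    if sample_rate >= 1.0:
        return True
    if sample_rate <= 0.0:
        return False
    digest = hashlib.sha256(trace_id.encode()).digest()
    bucket = int.from_bytes(digest[:8], "big") / 2**64  # uniform in [0, 1)
    return bucket < sample_rate

tid = "55df12035a754aa080875618bc5794c3"
print(head_sampled(tid, 1.0))  # True
print(head_sampled(tid, 0.0))  # False
# Deterministic: repeated calls agree for the same ID and rate.
print(head_sampled(tid, 0.1) == head_sampled(tid, 0.1))  # True
```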

Generate a starter example file:

tracelm init
tracelm init my_flow.py

Analyze a stored trace:

tracelm analyze <trace_id>

Analyze the most recent trace:

tracelm latest

List stored traces:

tracelm list

Compare two traces:

tracelm compare <trace_id_1> <trace_id_2>

Export to Chrome Trace:

tracelm export <trace_id> --format chrome

You can also use latest anywhere a trace ID is expected:

tracelm export latest --format chrome

Export to OTEL JSON:

tracelm export <trace_id> --format otel
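Chrome Trace is a plain JSON event format that chrome://tracing and Perfetto can open. Mapping spans to complete events ("ph": "X", with a start timestamp and duration in microseconds) looks roughly like this sketch of the format, not TraceLM's exporter:

```python
import json

def to_chrome_trace(spans):
    """Map (name, start_us, duration_us) spans to Chrome 'complete' events."""
    return {
        "traceEvents": [
            {
                "name": name,
                "ph": "X",       # complete event: carries start and duration
                "ts": start_us,  # microseconds
                "dur": dur_us,
                "pid": 1,
                "tid": 1,
            }
            for name, start_us, dur_us in spans
        ]
    }

spans = [("step1", 0, 82), ("step2", 82, 101)]
doc = to_chrome_trace(spans)
print(json.dumps(doc, indent=2))
```

Saving that JSON to a file and loading it in chrome://tracing renders the spans as a flame-style timeline.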

Design Notes

TraceLM keeps a strict execution model:

  • One local root span for CLI-created traces
  • Or one continuation entry span when joining an external trace
  • DAG validation before profiling
  • Async-safe context propagation with ContextVar

This keeps behavior predictable and makes failures explicit.
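Async-safe propagation with ContextVar means each asyncio task sees its own current span even when tasks interleave, because every task starts with a copy of the context. A stdlib sketch of the mechanism the design relies on:

```python
import asyncio
import contextvars

# Each asyncio task inherits a copy of the context at creation time,
# so a set() in one task never leaks into a sibling task.
current_span = contextvars.ContextVar("current_span", default="__root__")

seen = {}

async def step(name):
    current_span.set(name)           # visible only inside this task
    await asyncio.sleep(0)           # yield so the two tasks interleave
    seen[name] = current_span.get()  # still our own value, not a sibling's

async def main():
    await asyncio.gather(step("step1"), step("step2"))

asyncio.run(main())
print(seen)  # {'step1': 'step1', 'step2': 'step2'}
```

With a plain module-level global instead of a ContextVar, the second task's write would clobber the first task's span across the await, which is exactly the failure mode this design avoids.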

Scope

TraceLM is currently:

  • Single-process
  • Developer-focused
  • Local-first
  • CLI-centered

It is useful for understanding and instrumenting pipelines before adopting heavier observability infrastructure.

Documentation

  • Architecture: docs/ARCHITECTURE.md
  • Contribution guide: CONTRIBUTING.md

License

MIT
