TraceLM
Lightweight distributed LLM execution tracer and profiler
TraceLM is a lightweight execution tracer and profiler for LLM pipelines, RAG systems, and FastAPI-based AI services.
It focuses on the tracing primitives that matter when you want to understand execution flow clearly: span trees, traceparent propagation, sampling, profiling, and local inspection.
Why TraceLM
TraceLM exists for developers who want observability without pulling in a full telemetry stack on day one.
- Trace multi-step LLM workflows with explicit spans
- Validate trace structure before profiling
- Continue traces across service boundaries with W3C traceparent
- Inspect latency, critical path, token usage, and cost from the CLI
- Export traces to Chrome Trace or OpenTelemetry-compatible JSON
It is intentionally small, local-first, and easy to read.
Installation
pip install tracelm
Try it immediately:
tracelm demo
tracelm init
Optional integrations:
pip install "tracelm[fastapi]"
pip install "tracelm[requests]"
pip install "tracelm[otel]"
Quick Start
The fastest path is the built-in demo:
tracelm demo
tracelm latest
tracelm export latest --format chrome
That gives you a real trace, a readable summary, and an exportable file without writing any code.
If you want a starter file generated for you:
tracelm init
tracelm run tracelm_example.py
tracelm latest
tracelm run executes scripts with a standard if __name__ == "__main__": entrypoint, so generated starter files and ordinary Python scripts behave the way you expect.
If you want to trace your own script, create a small instrumented file:
from tracelm.decorator import node
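A minimal instrumented script might look like the sketch below. The bare `@node` usage is an assumption based on the import above, and the fallback decorator is only there so the sketch still runs without TraceLM installed:

```python
# test_app.py -- a small instrumented pipeline (illustrative sketch)
import time

try:
    from tracelm.decorator import node  # TraceLM's span decorator
except ImportError:
    # Fallback no-op decorator so this sketch runs without TraceLM installed.
    def node(fn):
        return fn

@node
def step1():
    time.sleep(0.05)  # simulate fast work
    return "retrieved"

@node
def step2():
    time.sleep(0.1)  # slower step -- shows up on the critical path
    return "generated"

@node
def pipeline():
    return step1(), step2()

if __name__ == "__main__":
    pipeline()
```

Each decorated function becomes a span in the trace, nested under whatever span is active when it is called.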
Then run it:
tracelm run test_app.py
tracelm latest
Typical output:
Trace Summary
-------------
Trace ID: 55df12035a754aa080875618bc5794c3
Total Latency: 0.204
Total Spans: 3
Slowest Span: step2
Critical Path: __root__ -> step2
Tokens In: 0
Tokens Out: 0
Total Cost: 0
Anomalies: {'latency_spikes': ['step2']}
Duration Histogram (ms)
-----------------------
0.000-0.100: 1
0.100-0.500: 1
...
Execution Tree
--------------
__root__ (0.204 ms)
+-- step1 (0.082 ms)
\-- step2 (0.101 ms)
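Conceptually, the critical path shown above is the chain of slowest children walked down from the root. A minimal sketch of that idea (not TraceLM's actual implementation), using the spans from the sample output:

```python
# Child spans keyed by parent name; each child is (name, duration_ms).
children = {
    "__root__": [("step1", 0.082), ("step2", 0.101)],
    "step1": [],
    "step2": [],
}

def critical_path(span="__root__"):
    """Follow the slowest child at each level down from the root."""
    path = [span]
    while children.get(span):
        span, _ = max(children[span], key=lambda c: c[1])
        path.append(span)
    return path

print(" -> ".join(critical_path()))  # __root__ -> step2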
Features
- Hierarchical span tracing with a synthetic root span model
- W3C traceparent parsing and continuation
- FastAPI middleware integration
- Requests-based outbound propagation
- Head-based probabilistic sampling
- Critical path and slowest-span analysis
- Duration histogram generation
- Token and cost aggregation
- Trace comparison from the CLI
- Chrome Trace export
- OpenTelemetry JSON export and SDK bridge
- SQLite-backed local trace storage
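Head-based sampling means the keep/drop decision is made once, when the trace starts, so a trace is always recorded or discarded as a whole rather than span by span. A minimal sketch of the idea (TraceLM's own sampler may differ in detail):

```python
import random

def head_sample(sample_rate: float) -> bool:
    """One keep/drop decision at trace start; all spans follow it."""
    return random.random() < sample_rate

# With a sample rate of 0.1, roughly 1 in 10 traces is recorded.
keep = head_sample(0.1)
```

Because the decision happens at the head, every recorded trace is complete, which keeps critical-path and latency analysis meaningful.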
FastAPI Integration
from fastapi import FastAPI
from tracelm.integrations.fastapi import TraceLMMiddleware
app = FastAPI()
app.add_middleware(TraceLMMiddleware, sample_rate=1.0)
@app.get("/")
def compute():
    return {"status": "ok"}
Import the requests integration to propagate trace context on outbound HTTP calls:
import tracelm.integrations.requests
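Under the hood, W3C context propagation comes down to a single HTTP header. The `traceparent` format is defined by the W3C Trace Context specification; constructing one by hand looks like this (the IDs below are illustrative):

```python
import re

trace_id = "55df12035a754aa080875618bc5794c3"  # 16 bytes, lowercase hex
parent_id = "00f067aa0ba902b7"                 # 8 bytes, lowercase hex
flags = "01"                                   # 01 = sampled

traceparent = f"00-{trace_id}-{parent_id}-{flags}"

# Shape check per the W3C Trace Context spec: version-traceid-parentid-flags
TRACEPARENT_RE = re.compile(r"^[0-9a-f]{2}-[0-9a-f]{32}-[0-9a-f]{16}-[0-9a-f]{2}$")
assert TRACEPARENT_RE.match(traceparent)
```

A downstream service that receives this header can continue the same trace by reusing the trace ID and treating the parent ID as its entry span's parent.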
CLI Commands
Run a Python file under tracing:
tracelm run test_app.py
Apply head sampling:
tracelm run test_app.py --sample-rate 0.1
Generate a starter example file:
tracelm init
tracelm init my_flow.py
Analyze a stored trace:
tracelm analyze <trace_id>
Analyze the most recent trace:
tracelm latest
List stored traces:
tracelm list
Compare two traces:
tracelm compare <trace_id_1> <trace_id_2>
Export to Chrome Trace:
tracelm export <trace_id> --format chrome
You can also use latest anywhere a trace ID is expected:
tracelm export latest --format chrome
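Files exported with --format chrome can be opened in chrome://tracing or Perfetto. In the Chrome Trace Event format, each span typically maps to a complete ("ph": "X") event; a minimal example of the file shape, with illustrative values:

```python
import json

events = [
    # One complete ("X") event per span; ts and dur are in microseconds.
    {"name": "step1", "ph": "X", "ts": 0, "dur": 82, "pid": 1, "tid": 1},
    {"name": "step2", "ph": "X", "ts": 82, "dur": 101, "pid": 1, "tid": 1},
]
doc = {"traceEvents": events}
print(json.dumps(doc, indent=2))
```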
Export to OTEL JSON:
tracelm export <trace_id> --format otel
Design Notes
TraceLM keeps a strict execution model:
- One local root span for CLI-created traces
- Or one continuation entry span when joining an external trace
- DAG validation before profiling
- Async-safe context propagation with ContextVar
This keeps behavior predictable and makes failures explicit.
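ContextVar gives each async task its own view of the current span, so concurrent requests never clobber each other's trace context. A minimal demonstration of that property, independent of TraceLM:

```python
import asyncio
import contextvars

# Each asyncio task gets its own copy of this variable's value.
current_span = contextvars.ContextVar("current_span", default="__root__")

async def handle(name: str) -> str:
    current_span.set(name)     # set in this task's context only
    await asyncio.sleep(0)     # yield so the other task runs in between
    return current_span.get()  # still our own value, not the other task's

async def main():
    return await asyncio.gather(handle("a"), handle("b"))

results = asyncio.run(main())
print(results)  # ['a', 'b'] -- no cross-task leakage
```

This is why span parentage stays correct even when two traced requests interleave on the same event loop.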
Scope
TraceLM is currently:
- Single-process
- Developer-focused
- Local-first
- CLI-centered
It is useful for understanding and instrumenting pipelines before adopting heavier observability infrastructure.
Documentation
- Architecture: docs/ARCHITECTURE.md
- Contribution guide: CONTRIBUTING.md
License
MIT
File details
Details for the file tracelm-0.4.4.tar.gz.
File metadata
- Download URL: tracelm-0.4.4.tar.gz
- Size: 19.5 kB
- Tags: Source
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.7
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 409b40de1a8f73da2cefa065bbb0f3bdd4a5873de99efab18a89b0993ad24e4f |
| MD5 | 2e31294895ca11bdea3343917ea31fd2 |
| BLAKE2b-256 | 61acd6e9c250f68a28c01e87162c23d968930b4379550fdaf385229bc82b0e32 |
Provenance
The following attestation bundles were made for tracelm-0.4.4.tar.gz:
Publisher: publish.yml on td-02/tracelm
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: tracelm-0.4.4.tar.gz
- Subject digest: 409b40de1a8f73da2cefa065bbb0f3bdd4a5873de99efab18a89b0993ad24e4f
- Sigstore transparency entry: 1208471062
- Permalink: td-02/tracelm@5cf4935219986473c2bfb2790de4d3f6b4a91d99
- Branch / Tag: refs/tags/v0.4.4
- Owner: https://github.com/td-02
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: publish.yml@5cf4935219986473c2bfb2790de4d3f6b4a91d99
- Trigger Event: push
File details
Details for the file tracelm-0.4.4-py3-none-any.whl.
File metadata
- Download URL: tracelm-0.4.4-py3-none-any.whl
- Size: 20.0 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.7
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | d77cc28259978a333f144dc59f85fb3b832aa5873285aa3c80b882e6835e38a0 |
| MD5 | 74a125f63593c330ab7bbdb68f3ad7af |
| BLAKE2b-256 | 7bba7256b5bba21ce521066c2e7e47039470b1f87656b49250e3a8d86591e75e |
Provenance
The following attestation bundles were made for tracelm-0.4.4-py3-none-any.whl:
Publisher: publish.yml on td-02/tracelm
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: tracelm-0.4.4-py3-none-any.whl
- Subject digest: d77cc28259978a333f144dc59f85fb3b832aa5873285aa3c80b882e6835e38a0
- Sigstore transparency entry: 1208471139
- Permalink: td-02/tracelm@5cf4935219986473c2bfb2790de4d3f6b4a91d99
- Branch / Tag: refs/tags/v0.4.4
- Owner: https://github.com/td-02
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: publish.yml@5cf4935219986473c2bfb2790de4d3f6b4a91d99
- Trigger Event: push