Vendor-neutral LLM observability SDK - instrument once, observe anywhere
> [!WARNING]
> This project is under active development. APIs may change between releases and it is not yet production-ready.
TraceCraft is a Python observability SDK with a built-in Terminal UI (TUI) that lets you visually explore, debug, and analyze agent traces right in your terminal — no browser, no cloud dashboard, no waiting.
The fastest path: zero code changes
If your app already uses OpenAI, Anthropic, LangChain, LlamaIndex, or any OpenTelemetry-compatible framework, TraceCraft can observe it without touching a single line of application code.
Step 1 — Install and set one environment variable:
```bash
pip install "tracecraft[receiver,tui]"
export OTEL_EXPORTER_OTLP_ENDPOINT=http://localhost:4318
```
Step 2 — Start the receiver and TUI together:
```bash
tracecraft serve --tui
```
Step 3 — Run your existing app unchanged:
```bash
python your_app.py
```
Traces from any OTLP-compatible framework (OpenLLMetry, LangChain, LlamaIndex, DSPy, or any standard OpenTelemetry SDK) stream live into the TUI the moment they arrive. No init() call. No decorators. No code changes.
All your agent runs at a glance — name, duration, token usage, and status.
Hierarchical waterfall view with timing bars. See exactly where your agent spends its time. Navigate to any LLM step and press i for the prompt, o for the response, or a for attributes.
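Under the hood, "OTLP-compatible" simply means spans arrive as standard OTLP payloads at the receiver's endpoint. As a rough illustration of what any OpenTelemetry SDK is producing for you automatically, here is a minimal trace payload in the OTLP/HTTP JSON encoding, built with only the standard library (the service and span names are made up; the actual sending step is commented out because it needs a running receiver):

```python
import json
import os
import time

# Minimal OTLP/HTTP JSON trace payload -- the same shape an
# OpenTelemetry SDK exports to http://localhost:4318/v1/traces.
now_ns = time.time_ns()
payload = {
    "resourceSpans": [{
        "resource": {
            "attributes": [{
                "key": "service.name",
                "value": {"stringValue": "your-app"},
            }]
        },
        "scopeSpans": [{
            "scope": {"name": "manual-example"},
            "spans": [{
                "traceId": os.urandom(16).hex(),  # 32 hex chars
                "spanId": os.urandom(8).hex(),    # 16 hex chars
                "name": "llm.chat",
                "kind": 3,  # SPAN_KIND_CLIENT
                "startTimeUnixNano": str(now_ns),
                "endTimeUnixNano": str(now_ns + 1_000_000),
            }],
        }],
    }]
}

body = json.dumps(payload)
# To actually send it to a running receiver:
# import urllib.request
# urllib.request.urlopen(urllib.request.Request(
#     "http://localhost:4318/v1/traces", body.encode(),
#     {"Content-Type": "application/json"}))
```

In practice you never write this by hand; the point of the zero-code path is that your existing SDK already emits it.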
Path 2 — Config file + one line
When you want a persistent local setup — custom service name, JSONL export, PII redaction — drop a config file into your project and add one line to your app:
.tracecraft/config.yaml:
```yaml
# .tracecraft/config.yaml
default:
  exporters:
    receiver: true          # stream to `tracecraft serve --tui`
  instrumentation:
    auto_instrument: true   # patches OpenAI, Anthropic, LangChain, LlamaIndex
```
Your app:
```python
import tracecraft

tracecraft.init()  # reads .tracecraft/config.yaml automatically
```
Then start the TUI:
```bash
tracecraft serve --tui
```
Or, if you prefer to write traces to a file and open the TUI separately:
```bash
tracecraft tui
```
> [!NOTE]
> Call `tracecraft.init()` before importing any LLM SDK. TraceCraft patches SDKs at import time — importing the SDK first means the patch won't apply.
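The reason is standard Python import semantics, not anything TraceCraft-specific: `from sdk import thing` binds the object as it exists at that moment, so a patch applied afterwards never reaches the already-bound name. A self-contained sketch (using a stand-in module, not a real SDK):

```python
import sys
import types

# Stand-in for an LLM SDK module.
sdk = types.ModuleType("fake_sdk")
sdk.complete = lambda prompt: "raw response"
sys.modules["fake_sdk"] = sdk

# App imports the function FIRST -- it binds the unpatched object.
from fake_sdk import complete as bound_early

# init()-style patching happens afterwards.
original = sdk.complete
sdk.complete = lambda prompt: f"[traced] {original(prompt)}"

# Anything imported AFTER the patch sees the traced version.
from fake_sdk import complete as bound_late

bound_early("hi")  # unpatched: "raw response"
bound_late("hi")   # patched:   "[traced] raw response"
```

Calling `init()` first guarantees every later `import openai` (or similar) resolves to the patched objects.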
SDK decorators
For fine-grained control — custom span names, explicit inputs/outputs, structured step hierarchies — TraceCraft provides @trace_agent, @trace_tool, @trace_llm, and @trace_retrieval decorators, plus a step() context manager for inline instrumentation. See the SDK Guide for details.
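Conceptually, each decorator wraps a function and records its name, inputs, outputs, and duration as a span. The stand-in below shows the general pattern in plain Python; it is not TraceCraft's actual implementation, and the real decorator signatures are documented in the SDK Guide:

```python
import functools
import time

SPANS = []  # stand-in for a real span buffer

def trace_tool(name=None):
    """Record one span per call: name, inputs, output, duration."""
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            start = time.perf_counter()
            result = fn(*args, **kwargs)
            SPANS.append({
                "name": name or fn.__name__,
                "input": {"args": args, "kwargs": kwargs},
                "output": result,
                "duration_s": time.perf_counter() - start,
            })
            return result
        return wrapper
    return decorator

@trace_tool(name="search")
def search_docs(query):
    return [f"doc about {query}"]

search_docs("retries")  # SPANS now holds one span named "search"
```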
TUI Keyboard Shortcuts
| Key | Action |
|---|---|
| ↑ / ↓ | Navigate traces |
| Enter | Expand trace — shows waterfall |
| i | View input/prompt |
| o | View output/response |
| a | View attributes and metadata |
| / | Filter traces |
| Tab | Cycle view modes |
| m + C | Mark and compare two traces |
| p | Open playground |
| q | Quit |
Why TraceCraft?
| Feature | TraceCraft | LangSmith | Langfuse | Phoenix |
|---|---|---|---|---|
| Terminal UI | Yes — built-in | No | No | No |
| Zero-Code Instrumentation | Yes | No | No | No |
| Vendor Lock-in | None | LangChain | Langfuse | Arize |
| Local Development | Full offline | Cloud required | Self-host | Self-host |
| OpenTelemetry Native | Built on OTel | Proprietary | Proprietary | OTel compatible |
| PII Redaction | SDK-level | Backend only | Backend only | Backend only |
| Cost | Free & Open Source | Paid tiers | Paid tiers | Paid tiers |
Features
- Built-in Terminal UI: Explore, filter, compare, and debug traces without leaving your terminal
- Local-First: All traces stay on your machine — the TUI is fully offline
- Zero-Code OTLP Receiver: Set one env var, run `tracecraft serve --tui`, observe any OTLP app
- Auto-Instrumentation: Two lines capture all OpenAI, Anthropic, LangChain, and LlamaIndex calls automatically
- Decorators: `@trace_agent`, `@trace_tool`, `@trace_llm`, `@trace_retrieval` for custom tracing
- Dual-Dialect Schema: OTel GenAI and OpenInference conventions
- PII Redaction: Client-side redaction before data ever leaves your app
- Export Anywhere: Console, JSONL, SQLite, OTLP, MLflow, HTML reports
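"SDK-level" redaction means sensitive values are scrubbed in-process, before a span is ever serialized or exported. A minimal illustration of the idea (the regex patterns and placeholder format here are examples, not TraceCraft's built-in rule set):

```python
import re

# Example patterns only -- a real redactor covers more PII classes.
PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "phone": re.compile(r"\+?\d[\d\s().-]{7,}\d"),
}

def redact(text):
    """Replace each match with a typed placeholder before export."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"<{label}-redacted>", text)
    return text

prompt = "Contact jane.doe@example.com or +1 (555) 010-7788."
redact(prompt)  # "Contact <email-redacted> or <phone-redacted>."
```

Because this runs client-side, the raw values never reach a backend at all, unlike backend-only redaction where the data has already left your machine.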
Installation
```bash
# OTLP receiver + TUI (zero code changes path)
pip install "tracecraft[receiver,tui]"

# TUI + auto-instrumentation
pip install "tracecraft[auto,tui]"

# With specific frameworks
pip install "tracecraft[langchain,tui]"
pip install "tracecraft[llamaindex,tui]"

# All features
pip install "tracecraft[all]"
```
Or with uv:
```bash
uv add "tracecraft[auto,tui]"
```
Framework Support
| Framework | Status | Installation |
|---|---|---|
| OpenAI | Stable (auto) | tracecraft[auto] |
| Anthropic | Stable (auto) | tracecraft[auto] |
| LangChain | Beta | tracecraft[langchain] |
| LlamaIndex | Beta | tracecraft[llamaindex] |
| PydanticAI | Beta | tracecraft[pydantic-ai] |
| Claude SDK | Beta | tracecraft[claude-sdk] |
| Custom Code | Stable | Base package |
Configuration
```python
import tracecraft

tracecraft.init(
    service_name="my-agent-service",
    jsonl=True,                  # Enable JSONL output for the TUI
    console=True,                # Pretty-print to console
    auto_instrument=True,        # Auto-capture OpenAI/Anthropic calls
    enable_pii_redaction=True,
    sampling_rate=1.0,
)
```
Environment variables:
```bash
export TRACECRAFT_SERVICE_NAME=my-service
export TRACECRAFT_ENVIRONMENT=production
export TRACECRAFT_SAMPLING_RATE=0.1
export TRACECRAFT_OTLP_ENDPOINT=http://localhost:4317
```
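Environment variables typically override code defaults. As a sketch of how such overrides can be read with typed fallbacks (the variable names come from above, but the precedence shown is an assumption; check the configuration docs for TraceCraft's actual merge order):

```python
import os

def env_config(defaults):
    """Overlay TRACECRAFT_* environment variables on code defaults."""
    cfg = dict(defaults)
    if v := os.getenv("TRACECRAFT_SERVICE_NAME"):
        cfg["service_name"] = v
    if v := os.getenv("TRACECRAFT_SAMPLING_RATE"):
        cfg["sampling_rate"] = float(v)  # e.g. "0.1" keeps 10% of traces
    return cfg

os.environ["TRACECRAFT_SAMPLING_RATE"] = "0.1"
cfg = env_config({"service_name": "my-agent-service", "sampling_rate": 1.0})
# cfg["sampling_rate"] is now 0.1; service_name keeps its default
```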
Export to Any Backend
```python
import tracecraft
from tracecraft.exporters import OTLPExporter

tracecraft.init(
    jsonl=True,  # Keep local TUI access
    exporters=[
        OTLPExporter(endpoint="http://localhost:4317"),  # Jaeger, Grafana, etc.
    ],
)
```
Supported backends: Langfuse, Datadog, Phoenix (Arize), Jaeger, Grafana Tempo, Honeycomb, any OTLP system.
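The pattern above is a fan-out: every finished span goes to every configured exporter, so local JSONL access and a remote OTLP backend are not mutually exclusive. A conceptual sketch of that design (class and method names here are illustrative stand-ins, not TraceCraft's actual exporter interface):

```python
import json

class JSONLExporter:
    """Append each span as one JSON line (what a local TUI can read)."""
    def __init__(self):
        self.lines = []          # stand-in for a file handle
    def export(self, span):
        self.lines.append(json.dumps(span))

class StubOTLPExporter:
    """Stand-in for a network exporter (Jaeger, Grafana, ...)."""
    def __init__(self):
        self.sent = []
    def export(self, span):
        self.sent.append(span)   # a real exporter would POST OTLP here

def fan_out(span, exporters):
    for exporter in exporters:   # every exporter sees every span
        exporter.export(span)

jsonl, otlp = JSONLExporter(), StubOTLPExporter()
fan_out({"name": "llm.chat", "duration_ms": 412}, [jsonl, otlp])
# jsonl.lines holds the serialized span; otlp.sent holds the dict
```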
Documentation
Contributing
We welcome contributions! See CONTRIBUTING.md for guidelines.
Development Setup
```bash
git clone https://github.com/LocalAI/tracecraft.git
cd tracecraft
uv sync --all-extras
uv run pre-commit install
uv run pytest
```
Security
See SECURITY.md for how to report security issues.
License
Apache-2.0 — See LICENSE for details.
Made with care by the TraceCraft Contributors