# opensmith

Local-first LLM pipeline tracer. No cloud. No setup.

## Why opensmith
LangSmith is powerful, but it is built around cloud-hosted tracing and is most natural inside the LangChain ecosystem. opensmith is a local-first alternative: install it with pip, use it with any Python LLM pipeline, and inspect traces on your machine without accounts, hosted services, Docker, or configuration. No trace data leaves your machine.
## Install

```shell
pip install opensmith
```
## Quickstart
### Example 1: `@trace` decorator

```python
import openai

from opensmith import trace

@trace
def call_llm(prompt: str):
    return openai.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": prompt}],
    )

@trace
def my_pipeline(question: str):
    # search_docs is your own retrieval function
    docs = search_docs(question)
    return call_llm(docs + question)
```
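Nested `@trace` calls like the ones above form a tree: `call_llm` is recorded as a child of `my_pipeline`. A minimal sketch of that nesting pattern (illustrative only, not the opensmith implementation):

```python
import functools

_stack = []   # spans currently open
spans = []    # finished top-level spans

def trace_sketch(fn):
    """Record each call as a span, nested under the caller's span."""
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        span = {"name": fn.__name__, "children": []}
        # attach to the enclosing span if one is open, else as a root
        (_stack[-1]["children"] if _stack else spans).append(span)
        _stack.append(span)
        try:
            return fn(*args, **kwargs)
        finally:
            _stack.pop()
    return wrapper

@trace_sketch
def step():
    return "docs"

@trace_sketch
def pipeline():
    return step() + "!"

pipeline()
print(spans)  # one root span, with "step" nested under "pipeline"
```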
### Example 2: context manager

```python
import openai

from opensmith import trace

query = "What is opensmith?"  # your user input

with trace("my_pipeline") as t:
    t.log("query", query)
    response = openai.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": query}],
    )
    t.log("response", response)
```
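The context-manager form can be pictured as a span object that collects logged key/value pairs and is persisted when the block exits. A minimal sketch of that pattern (the `Span` class and `finished` store are hypothetical; the real `trace` object comes from opensmith):

```python
from contextlib import contextmanager
from dataclasses import dataclass, field

@dataclass
class Span:
    name: str
    events: list = field(default_factory=list)

    def log(self, key, value):
        self.events.append((key, value))

finished = []  # stand-in for persistent trace storage

@contextmanager
def trace_sketch(name):
    span = Span(name)
    try:
        yield span
    finally:
        finished.append(span)  # persist the span even if the body raised

with trace_sketch("my_pipeline") as t:
    t.log("query", "hello")
```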
### Example 3: `autopatch()`, zero code changes

```python
from opensmith import autopatch

autopatch()
```

Patch only selected backends:

```python
from opensmith import autopatch

autopatch(only=["openai"])
```

Patch everything except selected backends:

```python
from opensmith import autopatch

autopatch(exclude=["chromadb"])
```
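Auto-patching generally works by replacing a client method with a wrapper that records each call before delegating to the original. A minimal sketch of that monkey-patching idea (the `record` list, `wrap_method` helper, and `FakeClient` are illustrative, not the opensmith API):

```python
import functools
import time

record = []  # stand-in for trace storage

def wrap_method(obj, name):
    """Replace obj.<name> with a wrapper that records each call."""
    original = getattr(obj, name)

    @functools.wraps(original)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        result = original(*args, **kwargs)
        record.append({"method": name, "seconds": time.perf_counter() - start})
        return result

    setattr(obj, name, wrapper)

class FakeClient:
    """Stands in for an LLM client such as openai's."""
    def create(self, **kwargs):
        return {"ok": True}

client = FakeClient()
wrap_method(client, "create")
client.create(model="gpt-4o-mini")  # recorded transparently
```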
## Dashboard

```shell
opensmith ui
```

Open http://localhost:7823.
## CLI reference

| Command | Description |
|---|---|
| `opensmith ui` | Start the local dashboard at localhost:7823. |
| `opensmith traces` | List recent traces in the terminal. |
| `opensmith stats` | Show aggregate trace, step, token, and cost statistics. |
| `opensmith clear` | Delete all locally stored traces after confirmation. |
## Supported backends
| Backend | Package | Status |
|---|---|---|
| openai | openai | ✅ |
| anthropic | anthropic | ✅ |
| litellm | litellm | ✅ |
| qdrant | qdrant-client | ✅ |
| chromadb | chromadb | ✅ |
| pinecone | pinecone-client | ✅ |
## Storage

Traces are stored locally at `~/.opensmith/traces.db`.
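Since the store is a plain SQLite file, the standard library can inspect it directly. The snippet below only lists table names rather than querying rows, because the schema itself is not documented here:

```python
import sqlite3
from pathlib import Path

# Path from the docs above; connecting would create the file, so guard
# with exists() first.
DB_PATH = Path.home() / ".opensmith" / "traces.db"

def list_tables(db_path):
    """Return the table names in a SQLite database file."""
    with sqlite3.connect(db_path) as conn:
        rows = conn.execute(
            "SELECT name FROM sqlite_master WHERE type = 'table'"
        ).fetchall()
    return [name for (name,) in rows]

if DB_PATH.exists():
    print(list_tables(DB_PATH))
```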
## License

MIT