
 ██████  ██████  ███████ ███    ██ ███████ ███    ███ ██ ████████ ██   ██ 
██    ██ ██   ██ ██      ████   ██ ██      ████  ████ ██    ██    ██   ██ 
██    ██ ██████  █████   ██ ██  ██ ███████ ██ ████ ██ ██    ██    ███████ 
██    ██ ██      ██      ██  ██ ██      ██ ██  ██  ██ ██    ██    ██   ██ 
 ██████  ██      ███████ ██   ████ ███████ ██      ██ ██    ██    ██   ██ 

Local-first LLM pipeline tracer. No cloud. No setup.


opensmith

Why opensmith

LangSmith is powerful, but it is built around cloud-hosted tracing and is most natural inside the LangChain ecosystem. opensmith is a local-first alternative: install it with pip, use it with any Python LLM pipeline, and inspect traces on your machine without accounts, hosted services, Docker, or configuration. No trace data leaves your machine.

Install

pip install opensmith

Quickstart

Example 1: @trace decorator

import openai

from opensmith import trace


@trace
def call_llm(prompt: str):
    return openai.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": prompt}],
    )


@trace
def my_pipeline(question: str):
    # search_docs is your own retrieval function
    docs = search_docs(question)
    return call_llm(docs + question)

Async functions are supported:

from openai import AsyncOpenAI

from opensmith import trace

client = AsyncOpenAI()


@trace(tags=["production", "rag"])
async def call_llm(prompt: str):
    return await client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": prompt}],
    )
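Not opensmith's actual implementation, but the general pattern behind a decorator that accepts both sync and async functions is worth seeing: check `asyncio.iscoroutinefunction` and return the matching wrapper. A minimal sketch, with simple timing standing in for whatever opensmith actually records:

```python
import asyncio
import functools
import time


def timed(func):
    """Illustrative sync/async-aware decorator (not opensmith's code)."""
    if asyncio.iscoroutinefunction(func):
        @functools.wraps(func)
        async def async_wrapper(*args, **kwargs):
            start = time.perf_counter()
            try:
                return await func(*args, **kwargs)
            finally:
                print(f"{func.__name__}: {time.perf_counter() - start:.4f}s")
        return async_wrapper

    @functools.wraps(func)
    def sync_wrapper(*args, **kwargs):
        start = time.perf_counter()
        try:
            return func(*args, **kwargs)
        finally:
            print(f"{func.__name__}: {time.perf_counter() - start:.4f}s")
    return sync_wrapper


@timed
def add(a, b):
    return a + b


@timed
async def async_add(a, b):
    return a + b
```

Because the wrapper is chosen once at decoration time, awaiting the decorated coroutine behaves exactly like awaiting the original.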

Example 2: context manager

import openai

from opensmith import trace

query = "What is local-first software?"


with trace("my_pipeline", tags=["debug"]) as t:
    t.log("query", query)
    response = openai.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": query}],
    )
    t.log("response", response)
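Conceptually, a tracing context manager is just an object that stamps a start time on `__enter__`, accumulates `log()` entries, and records the duration on `__exit__`. A hypothetical sketch of that shape (the `Span` class and its fields are made up for illustration; opensmith's internals may differ):

```python
import time


class Span:
    """Hypothetical span object; not opensmith's real implementation."""

    def __init__(self, name, tags=None):
        self.name = name
        self.tags = tags or []
        self.entries = []

    def log(self, key, value):
        # Record a key/value pair on this span.
        self.entries.append((key, value))

    def __enter__(self):
        self.start = time.perf_counter()
        return self

    def __exit__(self, exc_type, exc, tb):
        self.duration = time.perf_counter() - self.start
        return False  # never swallow exceptions from the traced block


with Span("my_pipeline", tags=["debug"]) as t:
    t.log("query", "hello")
    t.log("response", "world")
```

Returning `False` from `__exit__` matters: a tracer should record failures, not hide them.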

Example 3: autopatch() with zero code changes

from opensmith import autopatch


autopatch()

Patch only selected backends:

from opensmith import autopatch


autopatch(only=["openai"])

Patch everything except selected backends:

from opensmith import autopatch


autopatch(exclude=["chromadb"])
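autopatch-style instrumentation is ordinary monkeypatching: replace a callable on the target object or module with a wrapper that records the call and then delegates to the original. A generic sketch against a stand-in client (illustrative only; `FakeClient` and `patch_method` are invented names, not opensmith APIs):

```python
import functools

calls = []  # call log, standing in for a trace store


class FakeClient:
    """Stand-in for a third-party SDK object."""

    def create(self, prompt):
        return f"response to {prompt!r}"


def patch_method(obj, name):
    """Replace obj.<name> with a wrapper that records each call."""
    original = getattr(obj, name)

    @functools.wraps(original)
    def wrapper(*args, **kwargs):
        calls.append((name, args, kwargs))
        return original(*args, **kwargs)

    setattr(obj, name, wrapper)


client = FakeClient()
patch_method(client, "create")
client.create("hi")  # recorded in `calls`, then delegated as usual
```

Because the wrapper delegates unchanged, calling code needs no modification, which is the point of the zero-code-changes mode.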

Console mode

Print trace results to the terminal as they complete:

from opensmith import set_console_mode, trace


set_console_mode(True)


@trace
def my_func():
    return "ok"


my_func()  # the completed trace is printed to the terminal

Configuration

opensmith reads opensmith.json from the current working directory on import:

{
  "db_path": "./my_traces.db",
  "console_mode": false,
  "autopatch": ["openai", "qdrant"]
}
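Conceptually, that lookup is "load opensmith.json from the current working directory if it exists, otherwise fall back to the defaults". A sketch of the resolution logic (illustrative; `load_config` and `DEFAULTS` are made-up names, with default values mirroring the documented ones):

```python
import json
from pathlib import Path

DEFAULTS = {
    "db_path": str(Path.home() / ".opensmith" / "traces.db"),
    "console_mode": False,
    "autopatch": [],
}


def load_config(directory="."):
    """Merge opensmith.json (if present in `directory`) over the defaults."""
    config = dict(DEFAULTS)
    path = Path(directory) / "opensmith.json"
    if path.is_file():
        config.update(json.loads(path.read_text()))
    return config
```

Keys absent from the file keep their defaults, so a partial opensmith.json is fine.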

Dashboard

opensmith ui

Open http://localhost:7823.

CLI reference

| Command | Description |
| --- | --- |
| `opensmith ui` | Start the local dashboard at `localhost:7823`. |
| `opensmith traces` | List recent traces in the terminal. |
| `opensmith stats` | Show aggregate trace, step, token, and cost statistics. |
| `opensmith clear` | Delete all locally stored traces after confirmation. |

Supported backends

| Backend | Package |
| --- | --- |
| openai | `openai` |
| anthropic | `anthropic` |
| litellm | `litellm` |
| qdrant | `qdrant-client` |
| chromadb | `chromadb` |
| pinecone | `pinecone-client` |

Storage

Traces are stored locally at ~/.opensmith/traces.db unless overridden with opensmith.json or set_default_db_path().
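If, as the `.db` suffix suggests, the store is a SQLite file, you can poke at it with the standard `sqlite3` module. Listing the tables needs no knowledge of opensmith's schema (which is not documented here):

```python
import sqlite3
from pathlib import Path

db_path = Path.home() / ".opensmith" / "traces.db"

if db_path.exists():
    with sqlite3.connect(db_path) as conn:
        # sqlite_master lists every table in the database file.
        tables = [row[0] for row in conn.execute(
            "SELECT name FROM sqlite_master WHERE type = 'table'"
        )]
    print(tables)
```

Treat this as read-only exploration; the dashboard and CLI are the supported interfaces.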


License

MIT
