Lightweight LLM observability SDK

Project description

LLM Watch SDK

Lightweight SDK to send LLM usage telemetry to the LLM Watch backend.

Install

pip install llm-watch-sdk

Quickstart

LLM Watch reads its configuration from environment variables by default:

LLMWATCH_BACKEND_URL=http://127.0.0.1:8000
LLMWATCH_API_KEY=YOUR_PROJECT_KEY

from llm_watch import LLMWatch

watch = LLMWatch()  # reads configuration from the environment

You can still pass backend_url and project_api_key explicitly.
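One plausible resolution order is that explicit arguments win over the environment; the sketch below models that in plain Python. Note that `resolve_config` is an illustrative helper, not an SDK function:

```python
import os

def resolve_config(backend_url=None, project_api_key=None):
    # Explicit arguments take precedence; otherwise fall back to the environment.
    return {
        "backend_url": backend_url or os.environ.get("LLMWATCH_BACKEND_URL"),
        "project_api_key": project_api_key or os.environ.get("LLMWATCH_API_KEY"),
    }

os.environ["LLMWATCH_BACKEND_URL"] = "http://127.0.0.1:8000"
cfg = resolve_config(project_api_key="explicit-key")
```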

Providers

Instrument your provider client with the matching adapter, then call it as usual.

# OpenAI example (instrument existing client)
from openai import OpenAI

client = OpenAI(api_key="...")
client = watch.instrument_openai(client, model="gpt-4o-mini")

with watch.trace(actor_id="user-42"):
    resp = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": "hello"}],
    )

# Gemini example (google.genai)
from google import genai

client = genai.Client(api_key="...")
gm = watch.instrument_gemini(client, model="models/gemini-pro-latest")

with watch.trace(actor_id="user-42"):
    resp = gm.models.generate_content(
        model="models/gemini-pro-latest",
        contents="hello",
    )

# Bedrock example (wrap existing client)
import boto3

br = boto3.client("bedrock-runtime", region_name="eu-north-1")
br = watch.instrument_bedrock(client=br, model_id="arn:aws:bedrock:...")

with watch.trace(actor_id="user-42"):
    resp = br.invoke_model(
        modelId="arn:aws:bedrock:...",
        body=b'{"messages":[{"role":"user","content":[{"text":"hello"}]}]}',
        contentType="application/json",
        accept="application/json",
    )
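Rather than hand-writing the byte string passed as body, you can build it from plain Python data with json.dumps. A small sketch, using the same message shape as the example above:

```python
import json

# Build the invoke_model request body from structured data.
messages = [{"role": "user", "content": [{"text": "hello"}]}]
body = json.dumps({"messages": messages}).encode("utf-8")
```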

Manual logging (if you prefer)

watch.log_llm_call(
    provider="openai",
    model="gpt-4o-mini",
    input_tokens=120,
    output_tokens=320,
    total_tokens=440,
    latency_ms=780,
    actor_id="user-42",
)
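The values passed to log_llm_call can be derived around the provider call itself. A minimal sketch, where the token counts are the illustrative numbers from the example above (in practice they come from the provider's response):

```python
import time

start = time.perf_counter()
# ... make the provider call here ...
input_tokens, output_tokens = 120, 320  # in practice: read from the response
latency_ms = int((time.perf_counter() - start) * 1000)
total_tokens = input_tokens + output_tokens  # 440
```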

watch.trace() is optional. If you don’t create a trace, LLM Watch will auto-create one per call. Use trace() to group calls under a single workflow.
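The grouping behavior described above can be modeled with contextvars. This is an illustrative sketch of how a trace context manager might work, not the SDK's actual implementation:

```python
import contextlib
import contextvars
import uuid

# Holds the active trace id for the current context, if any.
_current_trace = contextvars.ContextVar("llm_watch_trace", default=None)

@contextlib.contextmanager
def trace(actor_id=None):
    # Open a trace: every call made inside the block shares one trace id.
    token = _current_trace.set(str(uuid.uuid4()))
    try:
        yield _current_trace.get()
    finally:
        _current_trace.reset(token)

def trace_id_for_call():
    # Reuse the active trace if one is open; otherwise auto-create
    # a fresh trace id for this single call.
    return _current_trace.get() or str(uuid.uuid4())
```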

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

llm_watch_sdk-0.1.6.tar.gz (20.1 kB)

Uploaded Source

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

llm_watch_sdk-0.1.6-py3-none-any.whl (30.1 kB)

Uploaded Python 3

File details

Details for the file llm_watch_sdk-0.1.6.tar.gz.

File metadata

  • Download URL: llm_watch_sdk-0.1.6.tar.gz
  • Upload date:
  • Size: 20.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.12.6

File hashes

Hashes for llm_watch_sdk-0.1.6.tar.gz
Algorithm Hash digest
SHA256 2f5814bcbd7fc542bfca4692982b0c111eec599713458f68822d05ef7221c173
MD5 2670ed35b67835e4c1b415a6179baf99
BLAKE2b-256 b91ffaa13584431d22e25f26ea225237f7e2b29ea1fff681435f6e79dc47a817

See the pip documentation for more details on using hashes.

File details

Details for the file llm_watch_sdk-0.1.6-py3-none-any.whl.

File metadata

  • Download URL: llm_watch_sdk-0.1.6-py3-none-any.whl
  • Upload date:
  • Size: 30.1 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.12.6

File hashes

Hashes for llm_watch_sdk-0.1.6-py3-none-any.whl
Algorithm Hash digest
SHA256 ace66ee28c1b3aabd4d4d231ae6f96f15eda8e29d2f6d68a408b6b273a5b772a
MD5 5e31882bfc59dbfbdbddb6d08ac6c469
BLAKE2b-256 043628ca594caf648f3161a063b0eb0f95a69ed50b4ea778d93dae8029e845db

See the pip documentation for more details on using hashes.
