

Project description

SpyLLM Python SDK

Automatic LLM tracing in two lines. Works with OpenAI, Anthropic, and more.

Prerequisites

You need a free SpyLLM account and an API key to use this SDK.

  1. Sign up at spyllm.dev/sign-up
  2. Go to Settings → API Keys and click Create API Key
  3. Copy the key — it is only shown once

Install

pip install spyllm

With provider extras:

pip install "spyllm[openai]"       # OpenAI
pip install "spyllm[anthropic]"    # Anthropic
pip install "spyllm[otel]"         # OpenTelemetry export

Quick Start

import spyllm

spyllm.init(api_key="sk-...")

# That's it. Every OpenAI and Anthropic call is now automatically traced.
from openai import OpenAI

client = OpenAI()
response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Hello!"}],
)
# Prompt, response, tokens, cost, and latency are captured automatically.

Open the dashboard to see traces as they arrive.
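Under the hood, auto-instrumenting SDKs typically wrap the provider client's call method so every invocation is timed and recorded. The sketch below illustrates that general pattern only; `instrument`, `record`, and `FakeClient` are invented names for this example, not SpyLLM's actual internals.

```python
import time

def instrument(obj, method_name, record):
    """Wrap obj.method_name so each call reports latency, errors, and output."""
    original = getattr(obj, method_name)

    def wrapped(*args, **kwargs):
        start = time.perf_counter()
        try:
            result = original(*args, **kwargs)
        except Exception as exc:
            # Failed calls are still recorded, then re-raised unchanged.
            record({"error": repr(exc), "latency_s": time.perf_counter() - start})
            raise
        record({"response": result, "latency_s": time.perf_counter() - start})
        return result

    setattr(obj, method_name, wrapped)

class FakeClient:
    """Stand-in for a provider client, used here instead of a real API."""
    def complete(self, prompt):
        return prompt.upper()

traces = []
client = FakeClient()
instrument(client, "complete", traces.append)
client.complete("hello")  # behaves exactly as before, but is now recorded
```

Calling code is untouched, which is why a single `init()` can cover every provider call made afterwards.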

What Gets Captured

Every LLM call automatically records:

  • Prompt — full message history sent to the model
  • Response — the model's output
  • Token count — input + output tokens
  • Cost — estimated USD cost based on model pricing
  • Latency — wall-clock time for the API call
  • Tool calls — if the model invoked tools/functions
  • Errors — failed calls with the exception message
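The cost field is an estimate derived from token counts and a per-model price table. A minimal sketch of that arithmetic follows; the prices in `PRICING` are illustrative assumptions for this example, not SpyLLM's real pricing data.

```python
# Illustrative per-1M-token prices in USD (assumed values, not SpyLLM's table).
PRICING = {"gpt-4o": {"input": 2.50, "output": 10.00}}

def estimate_cost_usd(model: str, input_tokens: int, output_tokens: int) -> float:
    """Estimate USD cost: token counts times the per-million-token price."""
    p = PRICING[model]
    return (input_tokens * p["input"] + output_tokens * p["output"]) / 1_000_000
```

Because pricing tables change, treat any such estimate as approximate and check the dashboard figures against your provider's invoice.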

Supported Providers

Provider    Auto-instrumented
OpenAI      Yes
Anthropic   Yes

Advanced Usage

Manual Tracing

from spyllm import SpyLLMClient

client = SpyLLMClient(api_key="sk-...", base_url="https://api.spyllm.dev")
client.trace(
    agent_name="my-agent",
    prompt="What is 2+2?",
    response="4",
    token_count=15,
    cost_usd=0.001,
)

Decorator

from spyllm import agent_trace, init

init(api_key="sk-...")

@agent_trace("my-pipeline")
def run_pipeline(query: str) -> str:
    result = ...  # your code here
    return result
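A decorator like `agent_trace` generally works by wrapping the function to time it and capture its return value. The following is a hypothetical pure-Python sketch of that pattern, not SpyLLM's implementation; `trace_sketch` and `last_trace` are names invented for this example.

```python
import functools
import time

def trace_sketch(agent_name):
    """Hypothetical tracing decorator: records name, latency, and return value."""
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            start = time.perf_counter()
            result = fn(*args, **kwargs)
            # Stash the most recent trace on the wrapper for inspection.
            wrapper.last_trace = {
                "agent_name": agent_name,
                "latency_s": time.perf_counter() - start,
                "response": result,
            }
            return result
        return wrapper
    return decorator

@trace_sketch("my-pipeline")
def run_pipeline(query: str) -> str:
    return query[::-1]

run_pipeline("abc")  # the trace is captured as a side effect of the call
```

The real decorator presumably ships the trace to the SpyLLM backend rather than storing it locally, but the wrapping shape is the same.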

Disable Auto-instrumentation

spyllm.init(api_key="sk-...", instrument=False)

Self-hosted

Point the SDK at your own instance:

spyllm.init(api_key="sk-...", base_url="https://your-host.com")

License

MIT


Download files

Download the file for your platform.

Source Distribution

spyllm-0.2.1.tar.gz (8.5 kB)


Built Distribution


spyllm-0.2.1-py3-none-any.whl (10.7 kB)


File details

Details for the file spyllm-0.2.1.tar.gz.

File metadata

  • Download URL: spyllm-0.2.1.tar.gz
  • Size: 8.5 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for spyllm-0.2.1.tar.gz
Algorithm     Hash digest
SHA256        e1c91abd49f3db6ce99b5d56ef8352685a8e4da0ab61ecc1bdcede1e4bd5b793
MD5           6eb7f129b07837daa66284b44121b037
BLAKE2b-256   f616ba9f0b6e6d2fc34c51713794a4595f951cd68351e1f6b563a265e6fd9811
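To verify a downloaded file against the published SHA256 digest, Python's standard-library hashlib is enough. The helper below is a generic sketch; the commented line shows how you would apply it to the release file, while the example name `sha256_hex` is invented here.

```python
import hashlib

def sha256_hex(path: str) -> str:
    """Hex SHA256 digest of a file, read in chunks to bound memory use."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# Compare sha256_hex("spyllm-0.2.1.tar.gz") against the digest published above.
```

If the computed digest differs from the published one, the download is corrupted or has been tampered with and should not be installed.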


Provenance

The following attestation bundles were made for spyllm-0.2.1.tar.gz:

Publisher: sdk-publish.yml on Yemnis/spyllm

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file spyllm-0.2.1-py3-none-any.whl.

File metadata

  • Download URL: spyllm-0.2.1-py3-none-any.whl
  • Size: 10.7 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for spyllm-0.2.1-py3-none-any.whl
Algorithm     Hash digest
SHA256        ef3bdfdcc0b81735df8bbac6c1fd4fe5291920c959073adbe893370573146774
MD5           040b30a024536cf54b9f0d8e693dcf0a
BLAKE2b-256   912d09d293e6c2bbb8f23f7804266976c4864cd865610e60f11ec9e954199710


Provenance

The following attestation bundles were made for spyllm-0.2.1-py3-none-any.whl:

Publisher: sdk-publish.yml on Yemnis/spyllm

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
