
Project description

SpyLLM Python SDK


Automatic LLM tracing in two lines. Works with OpenAI, Anthropic, and more.

See it in action: view a live trace on the dashboard.

Prerequisites

You need a free SpyLLM account and an API key to use this SDK.

  1. Sign up at spyllm.dev/sign-up
  2. Go to Settings → API Keys and click Create API Key
  3. Copy the key — it is only shown once

Install

pip install spyllm

With provider extras:

pip install "spyllm[openai]"       # OpenAI
pip install "spyllm[anthropic]"    # Anthropic
pip install "spyllm[otel]"         # OpenTelemetry export

(The quotes keep shells like zsh from interpreting the square brackets.)

Quick Start

import spyllm

spyllm.init(api_key="sk-...")

# That's it. Every OpenAI and Anthropic call is now automatically traced.
from openai import OpenAI

client = OpenAI()
response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Hello!"}],
)
# Prompt, response, tokens, cost, and latency are captured automatically.

Open the dashboard to see traces as they arrive.

What Gets Captured

Every LLM call automatically records:

  • Prompt — full message history sent to the model
  • Response — the model's output
  • Token count — input + output tokens
  • Cost — estimated USD cost based on model pricing
  • Latency — wall-clock time for the API call
  • Tool calls — if the model invoked tools/functions
  • Errors — failed calls with the exception message
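Cost is derived from the token counts and per-model pricing. The arithmetic can be sketched as below; the rates here are hypothetical placeholders for illustration, not SpyLLM's actual pricing table:

```python
# Hypothetical per-million-token prices in USD (illustrative only).
PRICING = {
    "gpt-4o": {"input": 2.50, "output": 10.00},
}

def estimate_cost_usd(model: str, input_tokens: int, output_tokens: int) -> float:
    """Estimate USD cost: tokens times the per-million-token rate."""
    rates = PRICING[model]
    return (input_tokens * rates["input"] + output_tokens * rates["output"]) / 1_000_000

# 1,000 input tokens + 500 output tokens at the rates above
cost = estimate_cost_usd("gpt-4o", 1000, 500)
```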

Supported Providers

Provider     Auto-instrumented
OpenAI       Yes
Anthropic    Yes

Advanced Usage

Manual Tracing

from spyllm import SpyLLMClient

client = SpyLLMClient(api_key="sk-...", base_url="https://api.spyllm.dev")
client.trace(
    agent_name="my-agent",
    prompt="What is 2+2?",
    response="4",
    token_count=15,
    cost_usd=0.001,
)
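When you collect the trace fields yourself, latency is just wall-clock time around the call. A minimal sketch using time.perf_counter (the timed function here is a stand-in for your real LLM call):

```python
import time

def timed_call(fn, *args, **kwargs):
    """Run fn and return (result, wall-clock latency in seconds)."""
    start = time.perf_counter()
    result = fn(*args, **kwargs)
    return result, time.perf_counter() - start

# Stand-in for an LLM call; pass latency_s along to client.trace(...)
response, latency_s = timed_call(lambda q: q.upper(), "what is 2+2?")
```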

Decorator

from spyllm import agent_trace, init

init(api_key="sk-...")

@agent_trace("my-pipeline")
def run_pipeline(query: str) -> str:
    result = ...  # your code here
    return result
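A decorator of this shape can be sketched with functools.wraps: time the wrapped function and record its name, result, and latency. This is an illustration of the pattern, not SpyLLM's actual implementation (which reports to the SpyLLM backend rather than an attribute):

```python
import functools
import time

def agent_trace_sketch(agent_name: str):
    """Illustrative stand-in for agent_trace: times each call and keeps the last trace."""
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            start = time.perf_counter()
            result = fn(*args, **kwargs)
            # A real implementation would send this record to the tracing backend.
            wrapper.last_trace = {
                "agent_name": agent_name,
                "latency_s": time.perf_counter() - start,
                "response": result,
            }
            return result
        return wrapper
    return decorator

@agent_trace_sketch("my-pipeline")
def run_pipeline(query: str) -> str:
    return query.upper()
```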

Disable Auto-instrumentation

spyllm.init(api_key="sk-...", instrument=False)

Self-hosted

Point the SDK at your own instance:

spyllm.init(api_key="sk-...", base_url="https://your-host.com")

Documentation

Changelog

See GitHub Releases for a full changelog.

License

MIT

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

spyllm-0.2.2.tar.gz (8.7 kB)

Uploaded Source

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

spyllm-0.2.2-py3-none-any.whl (11.0 kB)

Uploaded Python 3

File details

Details for the file spyllm-0.2.2.tar.gz.

File metadata

  • Download URL: spyllm-0.2.2.tar.gz
  • Upload date:
  • Size: 8.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for spyllm-0.2.2.tar.gz
Algorithm Hash digest
SHA256 b74fc5cd59c212f940547a3940ab2a80fa657b545b450b23353c2adb0dd844d8
MD5 a004de783ad1f30a7a7f421e63076854
BLAKE2b-256 de2601a507eb67bad68cd9b8f141b479bfaf9fb933fa9170023a1b6e8bf98c2f

See more details on using hashes here.

Provenance

The following attestation bundles were made for spyllm-0.2.2.tar.gz:

Publisher: sdk-publish.yml on Yemnis/spyllm

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file spyllm-0.2.2-py3-none-any.whl.

File metadata

  • Download URL: spyllm-0.2.2-py3-none-any.whl
  • Upload date:
  • Size: 11.0 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for spyllm-0.2.2-py3-none-any.whl
Algorithm Hash digest
SHA256 401108064d1d7b8ef4eb8b0d6b9deb30e4ed101ae418639db8f8b78e4a149c3c
MD5 5feaef921c9f6eedf346ce405323ee02
BLAKE2b-256 d075339ca67d5de0d5099bcb6cc2098f9aa9b27e94ccb683cc63f4760263958b

See more details on using hashes here.

Provenance

The following attestation bundles were made for spyllm-0.2.2-py3-none-any.whl:

Publisher: sdk-publish.yml on Yemnis/spyllm

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
