
Speck - Development and observability toolkit for LLM apps.

Speck is a live-trace debugging and metrics-tracking platform for LLM apps.

Speck streamlines LLM app development with live debugging and metrics tracking. It simplifies prompt engineering and testing across any LLM, saving you time and improving your workflow.

Features

Speck's main features include:

  1. Live LLM debugging
  2. LLM observability
  3. Developer framework for calling models
  4. OpenAI proxy

Model Support
OpenAI
AzureOpenAI
Anthropic
Replicate
LiteLLM

The dashboard on the Speck website has four main features:

  • Home: Dashboard for LLM usage metrics
  • Logs: Inspect recent LLM calls
  • Playground: Prompt engineer with any model
  • Live Debug: Test prompts with on-the-fly debugging

If you have any feature requests or want to stay up to date, please join our Discord community!


Getting Started

Python

pip install speck

Then, you can run something like:

from speck import Speck

# Pass provider API keys directly; api_key is Speck's own key (optional here).
client = Speck(api_key=None, api_keys={"openai": "sk-...", "anthropic": "sk-..."})

# The model is selected with a provider-prefixed string in config.
response = client.chat.create(
    prompt=[{"role": "system", "content": "Count to 5."}],
    config={"model": "anthropic:claude-2"},
)

Now, each call will be logged for inspection and testing. Read more in our documentation!
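As the example above suggests, the `model` entry in `config` is a provider-prefixed string (`anthropic:claude-2`), so switching backends is a matter of changing that string. A minimal sketch of the convention, assuming the `provider:model` format holds for all supported backends (the helper below is illustrative, not part of Speck's API):

```python
# Illustrative helper (not part of Speck): split a Speck-style model
# string of the form "provider:model" into its two parts.
def split_model(model_string: str) -> tuple[str, str]:
    provider, _, model = model_string.partition(":")
    return provider, model

# The same chat.create() call would target a different backend
# simply by changing this one string in config.
print(split_model("anthropic:claude-2"))  # ('anthropic', 'claude-2')
print(split_model("openai:gpt-4"))        # ('openai', 'gpt-4')
```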

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

speck-0.1.8.tar.gz (24.6 kB)

Built Distribution

speck-0.1.8-py3-none-any.whl (31.8 kB)
