
Shuntly

A lightweight wiretap for LLM SDKs: capture all requests and responses with a single line of code.

Shuntly wraps LLM SDKs to record every request and response as JSON. Calling Shuntly.shunt() returns a wrapped client with its original interface and types preserved, so IDE autocomplete and type checking keep working. Shuntly provides a collection of configurable "sinks" to write records to stderr, files, named pipes, or any combination.
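Shuntly's internals aren't shown here, but interface-preserving capture of this kind is commonly built on a delegating proxy. The sketch below is purely illustrative (hypothetical names, not Shuntly's actual code), and a naive proxy like this does not preserve static types the way Shuntly does; it only shows the delegation-and-record idea:

```python
import functools

class Wiretap:
    """Delegate attribute access to a wrapped object, recording method calls."""

    def __init__(self, target, records):
        self._target = target
        self._records = records

    def __getattr__(self, name):
        attr = getattr(self._target, name)
        if not callable(attr):
            # Recurse into nested namespaces such as client.messages
            return Wiretap(attr, self._records)

        @functools.wraps(attr)
        def wrapper(*args, **kwargs):
            result = attr(*args, **kwargs)
            self._records.append({"method": name, "kwargs": kwargs})
            return result

        return wrapper

# A stand-in client with the nested-namespace shape of an LLM SDK:
class _Messages:
    def create(self, **kwargs):
        return "response"

class _FakeClient:
    def __init__(self):
        self.messages = _Messages()

records = []
client = Wiretap(_FakeClient(), records)
client.messages.create(model="m", max_tokens=1)  # records now holds one entry
```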

While debugging LLM tooling, you may want to see exactly what is sent and returned; when launching an agent, you may want a record of every call to the LLM. Shuntly captures it all without network components, a web-based platform, or complicated logging infrastructure.

Install

pip install shuntly

Integrate

Given an LLM SDK (e.g. anthropic, openai, google-genai), simply call Shuntly.shunt() with the instantiated SDK class. The returned object has the same type and interface.

from anthropic import Anthropic
from shuntly import Shuntly

# Without a sink provided, Shuntly output goes to stderr
client = Shuntly.shunt(Anthropic(api_key=API_KEY))

# Now use the client as before
message = client.messages.create(
    model="claude-sonnet-4-20250514",
    max_tokens=1024,
    messages=[{"role": "user", "content": "Hello"}],
)

Each call to messages.create() writes a complete JSON record:

{
  "timestamp": "2025-01-15T12:00:00+00:00",
  "hostname": "dev1",
  "user": "alice",
  "pid": 42,
  "client": "anthropic.Anthropic",
  "method": "messages.create",
  "request": {"model": "claude-sonnet-4-20250514", "max_tokens": 1024, "messages": [{"role": "user", "content": "Hello"}]},
  "response": {"id": "msg_...", "content": [{"type": "text", "text": "Hi!"}]},
  "duration_ms": 823.4,
  "error": null
}
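Because each record is a single self-contained JSON object per line, captures can be post-processed with nothing but the standard library. A small hypothetical example (field names follow the record above) that sums latency per method:

```python
import json
from collections import defaultdict

# Two records in the shape shown above, as they would appear in a capture
lines = [
    '{"method": "messages.create", "duration_ms": 823.4, "error": null}',
    '{"method": "messages.create", "duration_ms": 176.6, "error": null}',
]

latency = defaultdict(float)
for line in lines:
    record = json.loads(line)
    if record["error"] is None:  # skip failed calls
        latency[record["method"]] += record["duration_ms"]

print(dict(latency))
```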

View

Shuntly JSON output can be streamed or read with a JSON viewer like fx. These tools provide JSON syntax highlighting and collapsible sections.

View Real-Time Shuntly Output from stderr

Shuntly output, by default, goes to stderr; this is equivalent to providing a SinkStream to shunt():

from shuntly import Shuntly, SinkStream
client = Shuntly.shunt(Anthropic(api_key=API_KEY), SinkStream())

Given a command, you can pipe its stderr into fx (discarding stdout) with the following:

$ command 2>&1 >/dev/null | fx

View Real-Time Shuntly Output via a Pipe

To view Shuntly output via a named pipe in another terminal, the SinkPipe sink can be used. First, name the pipe when providing SinkPipe to shunt():

from shuntly import Shuntly, SinkPipe
client = Shuntly.shunt(Anthropic(api_key=API_KEY), SinkPipe('/tmp/shuntly.fifo'))

Then, in the terminal where you want to view Shuntly output, create the named pipe and provide it to fx:

$ mkfifo /tmp/shuntly.fifo; fx < /tmp/shuntly.fifo

Then, in another terminal, launch your command.
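If fx is not installed, any line-oriented JSON reader works against the pipe. A minimal Python stand-in (an illustrative helper, not part of Shuntly; the same function works on regular capture files too):

```python
import json

def pretty_stream(path):
    """Read one JSON record per line and pretty-print each as it arrives."""
    with open(path) as fh:
        for line in fh:
            if line.strip():
                print(json.dumps(json.loads(line), indent=2))

# Point it at the pipe created above, or at any SinkFile capture:
# pretty_stream('/tmp/shuntly.fifo')
```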

View Shuntly from a File

To store Shuntly output in a file, the SinkFile sink can be used. Name the file when providing SinkFile to shunt():

from shuntly import Shuntly, SinkFile
client = Shuntly.shunt(Anthropic(api_key=API_KEY), SinkFile('/tmp/shuntly.jsonl'))

Then, after your command is complete, view the file:

$ fx /tmp/shuntly.jsonl

Send Shuntly Output to Multiple Sinks

With SinkMany, Shuntly can write to multiple sinks simultaneously.

from shuntly import Shuntly, SinkStream, SinkFile, SinkMany

client = Shuntly.shunt(Anthropic(), SinkMany([
    SinkStream(),
    SinkFile('/tmp/shuntly.jsonl'),
]))

Custom Sinks

Custom sinks can be implemented by subclassing Sink and implementing write():

from shuntly import Sink, Record

class SinkPrint(Sink):
    def write(self, record: Record) -> None:
        print(record.client, record.method, record.duration_ms)

Supported SDKs

Shuntly presently handles these clients:

Client               Package  Methods
anthropic.Anthropic  PyPI     messages.create, messages.stream
openai.OpenAI        PyPI     chat.completions.create
google.genai.Client  PyPI     models.generate_content

For anything else, method paths can be explicitly provided:

client = Shuntly.shunt(my_client, methods=["chat.send", "embeddings.create"])

What is New in Shuntly

0.2.0

Fully tested and integrated support for OpenAI and Google SDKs.

SinkPipe is now interruptible.

0.1.0

Initial release.
