Shuntly
A lightweight wiretap for LLM SDKs: capture all requests and responses with a single line of code.
Shuntly wraps LLM SDK clients to record every request and response as JSON. Calling shunt() returns a client with its original interface and types preserved, so IDE autocomplete and type checking keep working. Shuntly provides a collection of configurable "sinks" that write records to stderr, files, named pipes, or any combination.
While debugging LLM tooling, you may want to see exactly what is being sent and returned; when launching an agent, you may want to record every call to the LLM. Shuntly captures it all without TLS interception, a web-based platform, or complicated logging infrastructure.
Install
```console
$ pip install shuntly
```
Integrate
Given an LLM SDK (e.g. anthropic, openai, google-genai), simply call shunt() with an instantiated client. The returned object has the same type and interface.
```python
from anthropic import Anthropic

from shuntly import shunt

# Without providing a sink, Shuntly output goes to stderr
client = shunt(Anthropic(api_key=API_KEY))

# Now use the client as before
message = client.messages.create(
    model="claude-sonnet-4-20250514",
    max_tokens=1024,
    messages=[{"role": "user", "content": "Hello"}],
)
```
Each call to messages.create() writes a complete JSON record:
```json
{
  "timestamp": "2025-01-15T12:00:00+00:00",
  "hostname": "dev1",
  "user": "alice",
  "pid": 42,
  "client": "anthropic.Anthropic",
  "method": "messages.create",
  "request": {"model": "claude-sonnet-4-20250514", "max_tokens": 1024, "messages": [{"role": "user", "content": "Hello"}]},
  "response": {"id": "msg_...", "content": [{"type": "text", "text": "Hi!"}]},
  "duration_ms": 823.4,
  "error": null
}
```
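Because each record is a single JSON document, logs are also easy to post-process programmatically. A minimal sketch with a hypothetical summarize() helper, assuming only the field names shown in the record above:

```python
import json

# One record, trimmed to the fields the helper uses (field names
# taken from the example record above).
record_line = (
    '{"timestamp": "2025-01-15T12:00:00+00:00",'
    ' "client": "anthropic.Anthropic", "method": "messages.create",'
    ' "duration_ms": 823.4, "error": null}'
)

def summarize(line: str) -> str:
    """Reduce one Shuntly JSON record to a one-line summary."""
    rec = json.loads(line)
    status = "error" if rec.get("error") else "ok"
    return f'{rec["client"]} {rec["method"]} {rec["duration_ms"]:.1f} ms [{status}]'

print(summarize(record_line))
# → anthropic.Anthropic messages.create 823.4 ms [ok]
```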
View
Shuntly JSON output can be streamed into or read with a JSON viewer such as fx, which provides syntax highlighting and collapsible sections.
View Realtime Shuntly from stderr
By default, Shuntly output goes to stderr; this is equivalent to providing a SinkStream to shunt():
```python
from shuntly import shunt, SinkStream

client = shunt(Anthropic(api_key=API_KEY), SinkStream())
```
Given a command, you can view Shuntly's stderr output in fx with the following (the redirection order sends stderr down the pipe while discarding the command's own stdout):

```console
$ command 2>&1 >/dev/null | fx
```
View Realtime Shuntly via a Pipe
To view Shuntly output via a named pipe in another terminal, use the SinkPipe sink. First, name the pipe when providing SinkPipe to shunt():

```python
from shuntly import shunt, SinkPipe

client = shunt(Anthropic(api_key=API_KEY), SinkPipe('/tmp/shuntly.fifo'))
```
Then, in the terminal where you want to view the output, create the named pipe and read it with fx:

```console
$ mkfifo /tmp/shuntly.fifo; fx < /tmp/shuntly.fifo
```
Then, in another terminal, launch your command.
View Shuntly from a File
To store Shuntly output in a file, use the SinkFile sink. Name the file when providing SinkFile to shunt():

```python
from shuntly import shunt, SinkFile

client = shunt(Anthropic(api_key=API_KEY), SinkFile('/tmp/shuntly.jsonl'))
```
Then, after your command is complete, view the file:

```console
$ fx /tmp/shuntly.jsonl
```
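Beyond browsing in fx, a SinkFile log can be crunched with a few lines of standard-library Python. A hypothetical sketch: it fabricates two sample records so it runs standalone, and assumes only the method and duration_ms fields shown earlier:

```python
import json
import tempfile
from collections import defaultdict
from pathlib import Path

# Fabricate a small SinkFile-style log (one JSON record per line)
# so the sketch is self-contained.
path = Path(tempfile.mkstemp(suffix=".jsonl")[1])
path.write_text("\n".join([
    json.dumps({"method": "messages.create", "duration_ms": 823.4, "error": None}),
    json.dumps({"method": "messages.create", "duration_ms": 176.6, "error": None}),
]))

# Tally total latency per method.
totals: dict[str, float] = defaultdict(float)
for line in path.read_text().splitlines():
    rec = json.loads(line)
    totals[rec["method"]] += rec["duration_ms"]

print(round(totals["messages.create"], 1))  # → 1000.0
```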
Send Shuntly Output to Multiple Sinks
Using SinkMany, you can write to multiple sinks simultaneously.

```python
from shuntly import shunt, SinkStream, SinkFile, SinkMany

client = shunt(Anthropic(), SinkMany([
    SinkStream(),
    SinkFile('/tmp/shuntly.jsonl'),
]))
```
Custom Sinks
Custom sinks can be implemented by subclassing Sink and implementing write():
```python
from shuntly import Sink, ShuntlyRecord

class SinkPrint(Sink):
    def write(self, record: ShuntlyRecord) -> None:
        print(record.client, record.method, record.duration_ms)
```
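Because the interface is a single write() method, sinks are easy to compose and test. A hypothetical sketch of a filtering sink that forwards only slow calls; it stubs a minimal record type rather than importing shuntly so it runs standalone, whereas with the real package you would subclass Sink and receive ShuntlyRecord instances:

```python
from dataclasses import dataclass

@dataclass
class Record:
    """Stand-in for ShuntlyRecord, with only the fields this sketch uses."""
    client: str
    method: str
    duration_ms: float

class SinkSlow:
    """Forward records whose duration exceeds a threshold to an inner sink."""
    def __init__(self, inner, threshold_ms: float = 500.0):
        self.inner = inner
        self.threshold_ms = threshold_ms

    def write(self, record: Record) -> None:
        if record.duration_ms >= self.threshold_ms:
            self.inner.write(record)

class SinkList:
    """Collect records in memory (handy for tests)."""
    def __init__(self):
        self.records = []

    def write(self, record: Record) -> None:
        self.records.append(record)

captured = SinkList()
slow = SinkSlow(captured, threshold_ms=500.0)
slow.write(Record("anthropic.Anthropic", "messages.create", 823.4))
slow.write(Record("anthropic.Anthropic", "messages.create", 12.0))
print(len(captured.records))  # → 1 (only the 823.4 ms call passes the filter)
```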
Supported SDKs
Shuntly presently handles these clients:
| Client | Package | Methods |
|---|---|---|
| `anthropic.Anthropic` | PyPI | `messages.create`, `messages.stream` |
| `openai.OpenAI` | PyPI | `chat.completions.create` |
| `google.genai.Client` | PyPI | `models.generate_content` |
For anything else, method paths can be explicitly provided:
```python
client = shunt(my_client, methods=["chat.send", "embeddings.create"])
```
What is New in Shuntly
0.4.0
- Renamed `Record` to `ShuntlyRecord`.
- Export `shunt()` without the `Shuntly` class.

0.2.0
- Fully tested and integrated support for the OpenAI and Google SDKs.
- `SinkPipe` is now interruptible.

0.1.0
- Initial release.
File details
Details for the file shuntly-0.4.0.tar.gz.
File metadata
- Download URL: shuntly-0.4.0.tar.gz
- Upload date:
- Size: 8.5 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.13.7
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `1d6d76f8e03ad8fd75ffceb1f0f457fb94281655b057aaf19062c6acbd1b0fdc` |
| MD5 | `de64dffe84f3e1c0df4782e8e0629049` |
| BLAKE2b-256 | `7d19f669e4600832c03cb3b61f0c557c428506d30112403b5dae83d9fff17bc0` |
File details
Details for the file shuntly-0.4.0-py3-none-any.whl.
File metadata
- Download URL: shuntly-0.4.0-py3-none-any.whl
- Upload date:
- Size: 7.4 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.13.7
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `1e2f22f075ce9b9fc1d91c91624c23c92bd2b66106adfd4e163c20e84d8f27aa` |
| MD5 | `686ac1ec924ab79557548d2879e02183` |
| BLAKE2b-256 | `b8c4da71e73ed5dacd3bbb3abec07f415e1e0a16f95f594478fd363d7f65bdcd` |