Low-latency interprocess communication via named pipes for Python agent/service workflows

Named Pipes as Agentic Tools

Low-latency IPC for persistent AI tool servers — LLM inference, TTS, STT, vector search, and more — all on one machine, no network stack required.


✨ Highlights

  • Persistent servers — model weights and state stay loaded between calls; no per-request startup cost
  • Kernel-speed IPC — named pipes route through kernel memory, not a network stack; lower latency than local HTTP
  • Multi-client fanout — one server handles many concurrent clients; each gets its own downstream pipe
  • Decorator API — register command handlers with a single @ch.handler("CMD") line
  • cpipe CLI — send ad-hoc commands to any running server from the terminal, like curl for pipes
  • Claude Code skill — an included skill teaches the assistant to discover and query live servers without leaving the session
  • Ready-made servers — drop-in pipes for LLM chat, text-to-speech, and speech-to-text
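The decorator registration pattern from the highlights can be sketched in a few lines of plain Python. The MiniPipe class below is illustrative only — it is not the library's real implementation, just the shape of a registry that a line like @ch.handler("CMD") implies:

```python
# Sketch of decorator-based command registration, in the spirit of
# @ch.handler("CMD"). MiniPipe is illustrative, not the library's API.
class MiniPipe:
    def __init__(self):
        self._handlers = {}

    def handler(self, command):
        """Return a decorator that registers fn as the handler for `command`."""
        def decorate(fn):
            self._handlers[command] = fn
            return fn
        return decorate

    def dispatch(self, command, payload):
        """Route an incoming command to its registered handler."""
        return self._handlers[command](payload)

ch = MiniPipe()

@ch.handler("ECHO")
def echo(payload):
    return payload.upper()

print(ch.dispatch("ECHO", "hello"))  # HELLO
```

The point of the pattern is that adding a new server capability is one decorated function, with dispatch handled centrally.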

Overview

This library uses named pipes as the transport layer for agentic tool servers — persistent background processes that expose capabilities such as LLM inference, text-to-speech, vector search, or browser automation to a Python orchestrator running on the same machine.

Because named pipes route data through kernel memory rather than a network stack, they offer lower latency than local HTTP and far less complexity than shared memory — a practical sweet spot for real-time applications like voice agents.
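The underlying mechanism needs nothing beyond the standard library. On a POSIX system, a FIFO created with os.mkfifo moves bytes between two processes (threads here, for brevity) through kernel buffers, with no sockets or serialization layers involved:

```python
# A named-pipe round trip in stdlib Python: writer and reader exchange
# bytes through kernel buffers via a FIFO, no network stack involved.
import os
import tempfile
import threading

fifo = os.path.join(tempfile.mkdtemp(), "demo-fifo")
os.mkfifo(fifo)  # create the named pipe as a filesystem entry

def writer():
    with open(fifo, "w") as w:   # blocks until a reader opens the other end
        w.write("ping\n")

t = threading.Thread(target=writer)
t.start()
with open(fifo) as r:            # blocks until the writer connects
    line = r.readline().strip()
print(line)                      # ping
t.join()
os.unlink(fifo)
```

Note that both open() calls block until the opposite end connects — which is also why, in the examples below, the server must be started before its clients.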

The same servers can be driven directly from Claude Code. An included agent skill teaches the assistant how to discover running pipe servers with cpipe --list, inspect their capabilities, and send commands.

For a deeper look at the design decisions and API reference, see DOCS.md.

Installation

# Core library only
pip install -e .

# With LLM inference support
pip install -e ".[llm]"

# With TTS support (macOS: mlx-audio + sounddevice)
pip install -e ".[tts]"

# With STT support (sounddevice; Voxtral weights vendored)
pip install -e ".[stt]"

Requires Python 3.11+. See DOCS.md for platform-specific dependency details.

Quick start

1. Start a server (Terminal 1):

conda activate named-pipes
cpipe --serve chat   # LLM server on /tmp/tool-chat

2. Query it from the CLI (Terminal 2):

cpipe /tmp/tool-chat chat --data '{"messages": [{"role":"user","content":"Hello!"}]}'
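When the payload gets larger than a one-liner, building the --data argument with json.dumps avoids hand-escaping JSON inside shell quotes. This sketch just constructs the command string shown above; it does not invoke cpipe:

```python
# Build the --data payload programmatically instead of hand-escaping JSON
# inside shell quotes. This mirrors the cpipe invocation above.
import json
import shlex

payload = json.dumps({"messages": [{"role": "user", "content": "Hello!"}]})
cmd = f"cpipe /tmp/tool-chat chat --data {shlex.quote(payload)}"
print(cmd)
```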

3. Or write a client in Python:

from named_pipes.tool_named_pipe import ToolNamedPipe, Role

with ToolNamedPipe("tool-chat", role=Role.CLIENT) as ch:
    ch.send_message("chat", '{"messages": [{"role":"user","content":"Hello!"}]}')
    for msg in ch.receive_stream():
        print(msg)

Examples

Start order matters — server first, then client (server creates the FIFOs).

# LLM chat
cpipe --serve chat                  # Terminal 1
python src/ex_chat_pipe/client.py   # Terminal 2

# LLM → TTS pipeline (spoken output)
cpipe --serve chat                  # Terminal 1: LLM  (/tmp/tool-chat)
cpipe --serve tts                   # Terminal 2: TTS  (/tmp/tool-tts)
python src/ex_tts_pipe/client.py    # Terminal 3: pipeline client

# Speech-to-text
cpipe --serve stt                   # Terminal 1: STT  (/tmp/tool-stt)
python src/ex_stt_pipe/client.py    # Terminal 2: subscriber

cpipe — CLI tool

cpipe /tmp/tool-chat chat --data '{"messages": [{"role":"user","content":"Hello"}]}'

cpipe --list    # discover running ToolNamedPipe servers (tool-* pipes)
cpipe --pid     # same, plus PIDs that have each pipe open
cpipe --clear   # delete orphaned tool pipes

See DOCS.md for all options and the full protocol reference.
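For a sense of how discovery like cpipe --list could work, here is a stdlib-only sketch: scan a directory for tool-* entries and keep only those that are actually FIFOs. This is illustrative, not the CLI's real implementation:

```python
# One plausible discovery mechanism for tool pipes: glob for tool-* entries
# and keep only real FIFOs. Illustrative sketch, not cpipe's implementation.
import glob
import os
import stat

def list_tool_pipes(root="/tmp"):
    """Return sorted paths under `root` matching tool-* that are FIFOs."""
    pipes = []
    for path in glob.glob(os.path.join(root, "tool-*")):
        if stat.S_ISFIFO(os.stat(path).st_mode):
            pipes.append(path)
    return sorted(pipes)
```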

Claude Code skill

An included skill at .claude/skills/cpipe/SKILL.md teaches Claude Code how to use cpipe to discover, inspect, and interact with live servers — so the LLM can query a local inference server or trigger TTS playback without leaving the coding session.

Download files

Source Distribution

named_pipes-0.3.0.tar.gz (49.9 kB)

Built Distribution

named_pipes-0.3.0-py3-none-any.whl (55.9 kB)

File details

Details for the file named_pipes-0.3.0.tar.gz.

File metadata

  • Download URL: named_pipes-0.3.0.tar.gz
  • Upload date:
  • Size: 49.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for named_pipes-0.3.0.tar.gz

  • SHA256: 8359abaa734700c2fa76e950219627d22dc8b35d7d2ed371bc0918cb45f8360a
  • MD5: 367aa8af100cfe4161090b6b2346c8ac
  • BLAKE2b-256: 8454f129e00ac09828ad8d5bf2a7aa390a0fa0df17528dd86998f1bcd8be4292

Provenance

The following attestation bundles were made for named_pipes-0.3.0.tar.gz:

Publisher: publish.yml on stefanwebb/named-pipes

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file named_pipes-0.3.0-py3-none-any.whl.

File metadata

  • Download URL: named_pipes-0.3.0-py3-none-any.whl
  • Upload date:
  • Size: 55.9 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for named_pipes-0.3.0-py3-none-any.whl

  • SHA256: 9cbc90b60e15e8953a1fa86a1af482087200e9a17b0e3ce1c8dbf13dcdda60c5
  • MD5: 1cc7191331b76e50a7ba1eb6b2bbcdb9
  • BLAKE2b-256: daa403fc2b4fc09f114dba3fcc5b0ac732f9df9689ad772b8fa162431a8e8c92

Provenance

The following attestation bundles were made for named_pipes-0.3.0-py3-none-any.whl:

Publisher: publish.yml on stefanwebb/named-pipes

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
