
Protocol-faithful Python wrapper for Pi RPC mode


picable

picable is a protocol-faithful Python wrapper for pi --mode rpc.

Homepage and source: https://github.com/sabaini/picable

It starts Pi lazily, communicates over strict JSONL on stdin/stdout, exposes typed commands/responses/events, and supports bounded, queue-like event subscriptions.

Features

  • lazy subprocess startup; PiClient() does not spawn Pi
  • one Python method per documented RPC command
  • strict LF-delimited JSONL framing
  • typed parsing for models, messages, responses, and events
  • multiple event subscribers via bounded per-subscriber queues
  • idle-only subprocess restart when Pi dies between commands
  • integration tests against a real pi --mode rpc subprocess, with a bundled mock backend by default
  • smoke coverage for the shipped example scripts, the Streamlit dataset-triage app, and an installed wheel in a clean virtualenv

Installation

pip install picable

Requires Python 3.11 or newer.

For local development:

python -m venv .venv
. .venv/bin/activate
pip install -e .[dev]

Example-only dependencies such as pandas and streamlit live in the examples extra, so core development does not require them. Install .[dev,examples] if you want to run the bundled examples.

Quick start

from picable import PiClient, PiClientOptions

options = PiClientOptions(provider="anthropic", model="claude-sonnet-4-20250514")

with PiClient(options) as client:
    events = client.subscribe_events(maxsize=500)
    client.prompt("Reply with exactly: hello")
    while True:
        event = events.get(timeout=30)
        print(event)
        if event.type == "agent_end":
            break

See examples/ for more runnable samples, including the Streamlit dataset triage assistant in examples/dataset_triage/.

API overview

Construction and lifecycle

from picable import PiClient, PiClientOptions

client = PiClient(
    PiClientOptions(
        executable="pi",
        provider="anthropic",
        model="claude-sonnet-4-20250514",
        no_session=False,
        session_dir=None,
        cwd=None,
        env=None,
        startup_timeout=10,
        command_timeout=30,
        idle_timeout=300,
        extra_args=(),
        auto_close_subscriptions=True,
    )
)

# equivalent shorthand:
client = PiClient(provider="anthropic", model="claude-sonnet-4-20250514")

Important lifecycle rules:

  • importing the package has no side effects
  • constructing PiClient() does not start Pi
  • the first command starts pi --mode rpc
  • startup_timeout bounds the lazy cold-start readiness probe; on a cold process the client first waits for an internal get_state response before sending your real command
  • after a cold start is ready, the user command still gets its normal command_timeout budget
  • PiClientOptions.env overlays the current process environment instead of replacing it wholesale
  • PiClientOptions.extra_args is appended to the spawned pi --mode rpc argv
  • close() or context-manager exit shuts the subprocess down
  • if idle_timeout is set and expires, the idle subprocess is stopped; the next command starts a fresh subprocess
  • if Pi exits while idle, the next command starts a fresh subprocess
  • if Pi exits during an active workflow, subscribers receive an error and the active run is not replayed
  • auto_close_subscriptions=True closes all subscriptions when the client closes
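The lazy-start and restart-after-idle-death rules can be pictured with a plain subprocess (an illustrative sketch only, not picable's implementation; the echo child stands in for pi --mode rpc):

```python
import subprocess
import sys

# A trivial line-echo child process standing in for `pi --mode rpc`.
ECHO = [sys.executable, "-u", "-c",
        "import sys\nfor line in sys.stdin: print(line, end='', flush=True)"]

class LazyProc:
    def __init__(self, argv):
        self._argv = argv
        self._proc = None  # constructing the wrapper spawns nothing

    def _ensure(self):
        # First command starts the subprocess; if it died while idle,
        # the next command transparently starts a fresh one.
        if self._proc is None or self._proc.poll() is not None:
            self._proc = subprocess.Popen(
                self._argv, stdin=subprocess.PIPE, stdout=subprocess.PIPE, text=True
            )
        return self._proc

    def command(self, line: str) -> str:
        proc = self._ensure()
        proc.stdin.write(line + "\n")
        proc.stdin.flush()
        return proc.stdout.readline().rstrip("\n")

    def close(self):
        if self._proc is not None:
            self._proc.terminate()
            self._proc.wait()
            self._proc = None

client = LazyProc(ECHO)
assert client._proc is None               # nothing spawned yet
assert client.command("ping") == "ping"   # first command spawns the child
client._proc.terminate(); client._proc.wait()
assert client.command("pong") == "pong"   # dead while idle: fresh subprocess
client.close()
```

Note that, as in picable, a death mid-command is not papered over here: only the idle case triggers a silent respawn.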

Commands

The public client mirrors Pi's documented RPC surface:

  • prompt(), steer(), follow_up(), abort()
  • additive convenience helper: continue_prompt() for the recommended immediate streamed follow-up path
  • new_session(), switch_session(), fork()
  • get_state(), get_messages(), get_session_stats()
  • set_model(), cycle_model(), get_available_models()
  • set_thinking_level(), cycle_thinking_level()
  • set_steering_mode(), set_follow_up_mode()
  • compact(), set_auto_compaction()
  • set_auto_retry(), abort_retry()
  • bash(), abort_bash()
  • export_html(), get_fork_messages(), get_last_assistant_text(), set_session_name(), get_commands()
  • respond_extension_ui_value(), respond_extension_ui_confirmed(), respond_extension_ui_cancelled() for RPC-safe extension UI dialogs
  • low-level send_command() when you need direct protocol access

Notable argument details:

  • prompt(), steer(), follow_up(), and continue_prompt() accept optional image content blocks (see picable.protocol_types.ImageContent)
  • prompt() also accepts streaming_behavior="steer" | "followUp"
  • continue_prompt() is the recommended immediate streamed follow-up helper; it sends prompt(..., streaming_behavior="followUp") while keeping the protocol-faithful low-level follow_up() and steer() methods available unchanged
  • in the current verified compatibility suite, raw follow_up() and steer() queue pending work in session state instead of starting a fresh streamed turn on their own
  • every high-level command accepts an optional per-call timeout= override
  • send_command() accepts either an explicit picable.commands.RpcCommand or a raw command name plus fields
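Both send_command() call shapes resolve to a single strict-JSONL frame on the wire. A minimal sketch of that serialization (the id field name and envelope layout here are assumptions for illustration, not the verified protocol):

```python
import itertools
import json

_ids = itertools.count(1)

def encode_command(name: str, **fields) -> bytes:
    # A raw command name plus fields becomes one JSON object
    # terminated by a single LF, per the strict JSONL framing rules.
    frame = {"id": next(_ids), "type": name, **fields}
    return json.dumps(frame, ensure_ascii=False).encode("utf-8") + b"\n"

wire = encode_command("set_model", provider="anthropic",
                      model="claude-sonnet-4-20250514")
decoded = json.loads(wire)
assert decoded["type"] == "set_model"
assert wire.endswith(b"\n")
```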

Event subscriptions

Pi RPC exposes one global process event stream. Because events are not request-scoped, a single PiClient supports only one active agent workflow at a time. If multiple threads or callers need stricter startup/command serialization, coordinate that outside the client.

subscribe_events() returns an EventSubscription: a bounded, queue-like object with get(), drain(), and close().

subscription = client.subscribe_events(maxsize=1000)
event = subscription.get(timeout=5)

subscription.get() also wakes reliably in the default blocking mode: if the client closes, the stream fails, or the subscriber overflows, the blocked caller is released and raises the corresponding exception instead of hanging forever.

Overflow behavior is explicit:

  • each subscriber has its own bounded queue
  • if one subscriber falls behind, that subscription fails with PiSubscriptionOverflowError
  • other subscribers continue unaffected
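The per-subscriber isolation above can be sketched as a plain fan-out over bounded queues (an illustrative pattern, not picable's internals; SubscriptionOverflow stands in for PiSubscriptionOverflowError):

```python
import queue

class SubscriptionOverflow(Exception):
    """Stand-in for PiSubscriptionOverflowError."""

class Subscription:
    def __init__(self, maxsize: int):
        self._q = queue.Queue(maxsize=maxsize)
        self.failed = False

    def publish(self, event) -> None:
        try:
            self._q.put_nowait(event)
        except queue.Full:
            # This subscriber fell behind: fail it, leave others alone.
            self.failed = True

    def get(self, timeout=None):
        if self.failed:
            raise SubscriptionOverflow("subscriber queue overflowed")
        return self._q.get(timeout=timeout)

def fan_out(subscribers, event):
    for sub in subscribers:
        if not sub.failed:
            sub.publish(event)

fast = Subscription(maxsize=10)
slow = Subscription(maxsize=1)
for i in range(3):
    fan_out([fast, slow], i)

assert slow.failed        # the slow subscriber overflowed...
assert fast.get() == 0    # ...while the fast one still sees every event
```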

Strict JSONL framing

RPC mode uses strict JSONL semantics:

  • each record is one JSON object
  • records are delimited by LF (\n) only
  • an optional trailing \r is accepted on input
  • embedded U+2028 and U+2029 inside JSON strings are valid and must not split records

picable uses a byte-oriented reader/writer instead of generic text line readers.
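The framing rules can be demonstrated with a byte-oriented round trip (a minimal sketch, not picable's actual reader/writer):

```python
import json

def write_frame(obj: dict) -> bytes:
    # One JSON object per record, terminated by a single LF.
    # ensure_ascii=False leaves U+2028/U+2029 as ordinary UTF-8 string
    # bytes, which contain no 0x0A and so cannot split a record.
    return json.dumps(obj, ensure_ascii=False).encode("utf-8") + b"\n"

def read_frames(buf: bytes) -> list[dict]:
    records = []
    for raw in buf.split(b"\n"):      # delimit on LF only
        line = raw.rstrip(b"\r")      # accept an optional trailing CR
        if not line:
            continue
        records.append(json.loads(line))
    return records

frame = write_frame({"type": "prompt", "text": "line\u2028sep"})
assert read_frames(frame + b'{"ok": true}\r\n') == [
    {"type": "prompt", "text": "line\u2028sep"},
    {"ok": True},
]
```

A generic text-mode line iterator would treat U+2028/U+2029 as line breaks under universal-newline splitting, which is exactly why a byte-oriented reader is required.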

Bash semantics

bash() executes immediately and returns a typed BashResult, but the output reaches the LLM only on the next prompt().

client.bash("ceph status")
client.bash("journalctl -u ceph-mon --no-pager | tail -100")
client.prompt("Analyze the failure using the collected command output")

The stored bash execution message does not emit its own event.

Extension UI support

picable supports the RPC-safe extension UI sub-protocol documented by Pi.

Supported request methods published through subscribe_events() as ExtensionUiRequestEvent values:

  • dialog methods: select, confirm, input, editor
  • fire-and-forget methods: notify, setStatus, setWidget, setTitle, set_editor_text

Dialog methods block Pi until the host responds with one of:

  • respond_extension_ui_value(request_id, value)
  • respond_extension_ui_confirmed(request_id, confirmed=True | False)
  • respond_extension_ui_cancelled(request_id)

Example:

import queue

from picable import PiClient
from picable.events import ExtensionUiRequestEvent
from picable.protocol_types import ConfirmExtensionUiRequest, SelectExtensionUiRequest

with PiClient() as client:
    subscription = client.subscribe_events(maxsize=200)
    saw_extension_ui_request = False

    while True:
        try:
            event = subscription.get(timeout=1 if saw_extension_ui_request else 30)
        except queue.Empty:
            if saw_extension_ui_request:
                break
            raise TimeoutError("Timed out waiting for extension UI events") from None
        if event.type == "agent_end":
            break
        if not isinstance(event, ExtensionUiRequestEvent):
            continue

        saw_extension_ui_request = True
        request = event.request
        if isinstance(request, SelectExtensionUiRequest):
            client.respond_extension_ui_value(request.id, request.options[0])
        elif isinstance(request, ConfirmExtensionUiRequest):
            client.respond_extension_ui_confirmed(request.id, confirmed=True)
        else:
            client.respond_extension_ui_cancelled(request.id)

Extension commands may emit only ExtensionUiRequestEvent records and no agent_end, so hosts should stop on either agent_end or an application-defined idle/completion condition.

Out of scope: TUI-only extension APIs such as ctx.ui.custom() and other direct terminal component hooks that are not carried by the RPC protocol.

Running tests

Unit tests

just test
just lint
just typecheck
just build

Or run just check to execute lint, type checking, unit tests, and build validation together.

An installed-artifact smoke test is also available locally:

just install-smoke

Example tests that need pandas are skipped unless you also install .[examples].

Integration tests

Integration tests run a real pi --mode rpc subprocess.

By default, the suite loads a bundled test-only extension at tests/integration/fixtures/mock_provider.ts and selects a canned-response mock model after startup, so external model credentials are not required.

Default requirements:

  • pi on PATH
  • the bundled mock extension fixture present in tests/integration/fixtures/

Run them with:

just test-integration

Or run just check-all to execute the standard checks plus integration tests.

The required integration gate now includes:

  • public-API contract coverage for command dispatch, lifecycle, subscriptions, auto-retry controls, and bash context behavior
  • end-to-end smoke runs for examples/basic_prompt.py, examples/session_flow.py, examples/bash_then_prompt.py, examples/extension_ui.py, and examples/review_gate_ui.py
  • a Streamlit AppTest smoke pass for examples/dataset_triage/app.py

Set PI_RPC_REQUIRE_INTEGRATION=1 when skips are unacceptable, such as CI jobs that are expected to install pi first. In that mode, the suite fails loudly instead of silently skipping when pi or the bundled mock fixture is unavailable.

Optional live-backend override:

  • PI_RPC_PROVIDER=<provider>
  • PI_RPC_MODEL=<model>

When those two variables are set, the generic pi_client fixture starts Pi against that real backend instead of the bundled mock path. The dedicated mock-backed assertions still use the canned-response fixture.

The mock provider can match either an exact last-user prompt or an exact trailing context sequence, which lets the suite assert multi-turn history and “bash output reaches the next prompt” behavior deterministically. If a test sends an unmapped prompt/context, the provider returns a clear [pi-rpc-mock missing canned response] ... sentinel so failures are obvious.
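That matching strategy is easy to picture as a lookup over canned responses (a sketch of the idea only; the real fixture is the TypeScript extension at tests/integration/fixtures/mock_provider.ts, the example keys here are invented, and only the sentinel prefix is quoted from the suite):

```python
SENTINEL = "[pi-rpc-mock missing canned response]"

# Canned responses keyed either by an exact last-user prompt (str key)
# or by an exact trailing context sequence (tuple key).
CANNED = {
    "Reply with exactly: hello": "hello",
    ("first turn", "second turn"): "multi-turn reply",
}

def respond(context: list[str]) -> str:
    last = context[-1]
    if last in CANNED:
        return CANNED[last]
    # Trailing-context matches let tests assert multi-turn history.
    for key, reply in CANNED.items():
        if isinstance(key, tuple) and tuple(context[-len(key):]) == key:
            return reply
    # Unmapped input returns an obvious sentinel instead of silence.
    return f"{SENTINEL} {last}"

assert respond(["Reply with exactly: hello"]) == "hello"
assert respond(["first turn", "second turn"]) == "multi-turn reply"
assert respond(["unmapped"]).startswith(SENTINEL)
```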

If pi is not installed or the bundled mock fixture is missing, the integration suite skips clearly.

Examples

  • examples/basic_prompt.py
  • examples/session_flow.py
  • examples/bash_then_prompt.py
  • examples/extension_ui.py
  • examples/review_gate_ui.py with examples/extensions/review_gate.ts - a realistic human-approval flow that exercises select, confirm, input, editor, notify, setStatus, setWidget, setTitle, and set_editor_text
  • examples/dataset_triage/ - Streamlit CSV/CSV.gz triage assistant with parse hints, bounded first-N profiling, prompt/transcript download, session HTML export, and Pi follow-ups via continue_prompt() (just dataset-triage bootstraps .venv and installs .[examples])

All shipped examples honor the same optional runtime overrides, which lets the integration suite point them at the bundled mock backend without editing the scripts:

  • PI_RPC_EXAMPLE_PROVIDER
  • PI_RPC_EXAMPLE_MODEL
  • PI_RPC_EXAMPLE_EXTRA_ARGS
  • PI_RPC_EXAMPLE_SESSION_DIR

For example, PI_RPC_EXAMPLE_EXTRA_ARGS='-e /path/to/mock_provider.ts' adds test-only extensions while keeping the script defaults intact.

Compatibility and release workflow

  • compatibility policy: docs/compatibility-policy.md
  • release checklist: docs/release-checklist.md
  • GitHub Actions: .github/workflows/ci.yml runs lint, type checking, unit tests, build validation, installed-wheel smoke, example smoke, dataset-triage app smoke, and the rest of the mock-backed integration suite; .github/workflows/compat-smoke.yml is an opt-in live smoke workflow that runs tests/integration/test_live_smoke.py against one real provider/model pair
  • current CI installs pi on Ubuntu runners with npm install -g @mariozechner/pi-coding-agent; if that upstream install path changes, update both the workflow and docs/compatibility-policy.md

Current limits

  • one active workflow per PiClient
  • synchronous/threaded API only
  • extension UI support is limited to the RPC-safe methods documented by Pi; TUI-only APIs such as ctx.ui.custom() remain out of scope
  • compatibility is enforced by tests and documented support policy, not by a protocol handshake
  • some upstream commands still have behavior quirks; the public docs describe the runtime behavior exercised by the deterministic integration suite rather than assuming every documented RPC command streams identically

Download files


Source Distribution

picable-0.1.0.tar.gz (31.9 kB)


Built Distribution


picable-0.1.0-py3-none-any.whl (29.6 kB)


File details

Details for the file picable-0.1.0.tar.gz.

File metadata

  • Download URL: picable-0.1.0.tar.gz
  • Size: 31.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.12.3

File hashes

Hashes for picable-0.1.0.tar.gz:

  • SHA256: 87e624acadb90a8afa7f3e901da804ef885e062bc69076297380a19ad9fbb0e2
  • MD5: 30b849efc3fd7ab420043679dba0af6b
  • BLAKE2b-256: 3f5272f575ac10cfb56eaeeac37a26363ed5384d7e3172537896ec4910919427


File details

Details for the file picable-0.1.0-py3-none-any.whl.

File metadata

  • Download URL: picable-0.1.0-py3-none-any.whl
  • Size: 29.6 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.12.3

File hashes

Hashes for picable-0.1.0-py3-none-any.whl:

  • SHA256: 74055c2231cafe1a6d89097d98da4cb7f82cce713512653cf29af33e39ec38f0
  • MD5: afb788d446c2806ebf87c22a0fe97dde
  • BLAKE2b-256: 8c3efa81d5cc2ee7d63888aad6080bd2c13f176a0303ddd1ba56a0427acd20c6
