# ToonPrompt

Universal prompt optimization wrapper for coding CLIs that compresses structured context into Toon-style blocks.
ToonPrompt is a pip-installable wrapper for coding CLIs such as Codex, Claude, Cursor, and Gemini. It preserves natural-language instructions, detects bulky structured context like JSON, YAML, logs, stack traces, and repeated records, then compresses only those sections into a deterministic Toon-style format before forwarding the prompt to the native tool.
This repository is currently in public alpha. The packaging metadata, contribution docs, and release scaffolding are intentionally lightweight so the project can ship early without hiding the rough edges.
## Why this exists
Large coding prompts often waste context on repeated keys, punctuation, and machine-shaped payloads. ToonPrompt reduces that overhead while keeping human intent readable and leaving uncertain content unchanged.
Default behavior:
- structured-only rewriting
- fail-open pass-through on uncertainty
- local minimal logs with redacted prompt hashes
- preview on demand
- prompt-only transformation
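The fail-open rule above can be sketched as a small guard: only content that parses cleanly as structured data (JSON here, for illustration) is handed to a compressor, and anything else passes through untouched. This is a minimal sketch of the principle, not ToonPrompt's actual detector:

```python
import json

def maybe_compress(segment: str, compress) -> str:
    """Return a compressed form only when the segment parses as JSON.

    Any parse failure means we are uncertain, so the original text is
    forwarded unchanged (fail-open pass-through).
    """
    try:
        data = json.loads(segment)
    except json.JSONDecodeError:
        return segment  # uncertain: leave the prompt untouched
    return compress(data)

# Prose passes through; only valid JSON is rewritten:
print(maybe_compress("fix the login bug", str))                  # unchanged
print(maybe_compress('{"id": 1}', lambda d: f"id={d['id']}"))    # id=1
```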
## Installation

### From GitHub

```shell
pip install git+https://github.com/kmrsandeep1998/ToonPrompt.git
```

### For development

```shell
git clone https://github.com/kmrsandeep1998/ToonPrompt.git
cd ToonPrompt
python -m venv .venv
source .venv/bin/activate
pip install -e ".[dev]"
```
## Quick start

Inspect a prompt without calling a native CLI:

```shell
toon inspect --prompt-file prompt.txt --preview --explain
```

Proxy a supported CLI:

```shell
toon codex --prompt-file prompt.txt -- --model gpt-5.4
printf '%s\n' '{"id":1,"name":"node"}' | toon claude --stdin -- --print
toon-cursor --prompt "Explain this stack trace" -- --help
```

Initialize config:

```shell
toon config init
```

Run diagnostics:

```shell
toon doctor
```
## Supported prompt sources

- `--prompt` for inline text
- `--prompt-file` for file-backed prompts
- `--stdin` for piped input
Interactive keystroke capture inside terminal UIs is intentionally out of scope for v0.1.
## How transformation works
ToonPrompt builds a prompt document, classifies segments, and only rewrites supported structured sections. Plain-language instructions remain unchanged.
Example JSON input:

```json
{
  "nodes": [
    {"id": 1, "name": "Node 1"},
    {"id": 2, "name": "Node 2"}
  ]
}
```

Compressed Toon-style output:

```
data:
  nodes[2]{id,name}:
    1,Node 1
    2,Node 2
```
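The transformation above can be sketched in a few lines: a list of flat records with identical keys becomes a `name[count]{fields}:` header plus comma-separated rows. This is a simplified illustration of the format, assuming uniform scalar cells and no escaping:

```python
import json

def to_toon(payload: dict) -> str:
    """Render dicts of uniform record lists in the Toon-style table form."""
    lines = ["data:"]
    for key, records in payload.items():
        fields = list(records[0].keys())
        # Header: two-space nesting, count in brackets, fields in braces.
        lines.append(f"  {key}[{len(records)}]{{{','.join(fields)}}}:")
        for rec in records:
            lines.append("    " + ",".join(str(rec[f]) for f in fields))
    return "\n".join(lines)

doc = json.loads('{"nodes": [{"id": 1, "name": "Node 1"}, {"id": 2, "name": "Node 2"}]}')
print(to_toon(doc))
```

Running this reproduces the compressed block shown above: the repeated `id`/`name` keys are stated once in the header, and each record collapses to one row.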
## Configuration

Global config path: `~/.config/toonprompt/config.toml`

Project override: `.toonprompt.toml`

Generate the default config with:

```shell
toon config init
```
## Current limitations
- best-effort parity across tools, not identical behavior
- response text is not post-processed
- interactive native TUI input is not intercepted
- Windows support is deferred
## Compatibility Matrix

| Tool | Invocation style | Prompt sources | Notes |
|---|---|---|---|
| Codex CLI | `toon codex -- ...` | `--prompt`, `--prompt-file`, `--stdin` | Best-effort wrapper around the native binary. |
| Claude CLI | `toon claude -- ...` | `--prompt`, `--prompt-file`, `--stdin` | Structured prompt compression only; no response rewriting. |
| Cursor CLI | `toon cursor -- ...` | `--prompt`, `--prompt-file`, `--stdin` | Wrapper mode only; interactive TUI keystrokes are not intercepted. |
| Gemini CLI | `toon gemini -- ...` | `--prompt`, `--prompt-file`, `--stdin` | Same baseline behavior as other adapters. |
## Toon Format Versioning

ToonPrompt serializes structured segments using `toon_format = "1"`. Version 1 guarantees:

- stable indentation with two-space nesting
- array headers in the form `name[count]:`
- scalar-record tables in the form `name[count]{field1,field2}:`
- escaped commas, newlines, and backslashes inside cells
- fail-open pass-through when input cannot be converted safely

Future format versions should be additive and explicitly gated by config so generated prompts remain predictable.
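The cell-escaping guarantee can be illustrated with a small helper that protects the three reserved characters. The exact escape sequences here are an assumption for illustration, not the published spec:

```python
def escape_cell(value: str) -> str:
    """Escape backslashes first, then commas and newlines, so that a
    comma-separated Toon row stays unambiguous and reversible."""
    return (value.replace("\\", "\\\\")
                 .replace(",", "\\,")
                 .replace("\n", "\\n"))

print(escape_cell("a,b"))            # a\,b
print(escape_cell("line1\nline2"))   # line1\nline2
```

Escaping the backslash first matters: otherwise a literal `\,` in the input would become indistinguishable from an escaped comma.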
## Benchmarks

Sample fixtures for token-savings benchmarking live in `benchmarks/fixtures`. They cover:

- nested JSON payloads
- repeated application logs
- mixed natural language plus structured context

Run the fixture benchmark script with:

```shell
PYTHONPATH=src python3 scripts/benchmark_fixtures.py
```
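A rough savings check along these lines just compares serialized sizes before and after compression. This uses character counts as a crude proxy for tokens and is not the repo's actual benchmark script:

```python
import json

# Raw JSON payload vs. a hand-written Toon-style rendering of the same records.
raw = json.dumps({"nodes": [{"id": i, "name": f"Node {i}"} for i in range(1, 4)]})
toon = "data:\n  nodes[3]{id,name}:\n" + "\n".join(
    f"    {i},Node {i}" for i in range(1, 4)
)

saving = 1 - len(toon) / len(raw)
print(f"raw={len(raw)} chars, toon={len(toon)} chars, saved {saving:.0%}")
```

The savings grow with record count, since the repeated keys and punctuation are paid once in the header instead of once per record.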
## Development

Run tests:

```shell
pytest
```