
promptmin


Prompt minimizer for LLM evals — shrink a prompt to the smallest input that still reproduces a failure using delta debugging (ddmin).
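The core loop can be illustrated with a compact sketch of Zeller's ddmin over a list of chunks (for a prompt, the chunks would be the sections, blocks, sentences, or lines selected by --granularity). This is an illustration of the algorithm, not promptmin's actual implementation; `fails` is any predicate that reruns your test on a candidate input:

```python
def partition(chunks, n):
    # Split into n contiguous, roughly equal-sized subsets.
    step = len(chunks) / n
    return [chunks[round(i * step):round((i + 1) * step)] for i in range(n)]

def ddmin(chunks, fails):
    # Shrink `chunks` to a small sublist on which `fails` still returns True.
    assert fails(chunks), "the full input must reproduce the failure"
    n = 2
    while len(chunks) >= 2:
        subsets = partition(chunks, n)
        reduced = None
        # 1. Reduce to subset: does some subset alone still fail?
        for subset in subsets:
            if fails(subset):
                reduced, n = subset, 2
                break
        # 2. Reduce to complement: does removing one subset still fail?
        if reduced is None:
            for i in range(n):
                complement = [c for j in range(n) if j != i for c in subsets[j]]
                if fails(complement):
                    reduced, n = complement, max(n - 1, 2)
                    break
        if reduced is not None:
            chunks = reduced
        elif n >= len(chunks):
            break                        # finest granularity, nothing removable
        else:
            n = min(n * 2, len(chunks))  # increase granularity and retry
    return chunks
```

Given a deterministic `fails`, the result is 1-minimal over the chosen chunks: no single remaining chunk can be dropped without losing the failure.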

Why?

  • Fast debugging: minimal repros beat 300-line prompts
  • Cheaper CI: fewer tokens, fewer moving parts
  • Safer iteration: smaller diffs, clearer "what changed"
  • Handles flakiness: stability modes (strict / k-of-n) and --confirm-final
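The k-of-n idea is that a candidate only counts as "still failing" when the failure reproduces in at least k of n repeated runs, so an occasional flaky pass does not derail minimization. A sketch of the concept (the parameter names and defaults here are hypothetical, not promptmin's):

```python
def fails_k_of_n(run_test, k=2, n=3):
    # Treat the failure as reproduced only if at least k of n runs fail.
    # run_test() returns True on pass, False on failure.
    failures = sum(1 for _ in range(n) if not run_test())
    return failures >= k
```

Used as the failure predicate inside delta debugging, this costs n test runs per candidate but makes the search robust to flakiness; --confirm-final adds one last verification of the minimized prompt.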

Installation

This package is a Python wrapper that shells out to the promptmin Node CLI.

# 1. Install the CLI
npm install -g promptmin

# 2. Install the Python wrapper
pip install promptmin

Usage

Python API

from promptmin import minimize

result = minimize(
    prompt_path="prompts/support.md",
    config_path="promptmin.config.json",
    target="test:refund_policy_01",
    budget_runs=60
)

print(f"Minimized: {result.minimized_path}")
print(f"Report: {result.report_path}")

CLI

promptmin-py minimize \
  --prompt prompts/support.md \
  --config promptmin.config.json \
  --target test:refund_policy_01 \
  --budget-runs 60

Config example

{
  "runner": {
    "type": "openai_responses",
    "model": "gpt-4.1-mini"
  },
  "tests": [{
    "id": "refund_policy_01",
    "input": { "user": "Can I get a refund?" },
    "assert": { "type": "regex_not_match", "pattern": "competitor" }
  }]
}

The openai_responses runner requires the OPENAI_API_KEY environment variable; alternatively, use a local_command runner to drive your own eval script.
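A local_command runner might look like the sketch below. Note that this schema is an assumption for illustration — the command field, the {prompt} placeholder, and the exit-code convention are all hypothetical; check the promptmin CLI docs for the real field names:

```json
{
  "runner": {
    "type": "local_command",
    "command": "python eval.py --prompt {prompt}"
  }
}
```

where eval.py would exit non-zero when the candidate prompt fails the test.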

Artifacts

After minimization, you get:

  • baseline.prompt / minimized.prompt — before and after
  • diff.patch — what was removed
  • report.md / meta.json — summary and metadata
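The before/after pair can also be compared directly. For example, a small helper built on Python's difflib (an illustrative snippet, not part of promptmin) that counts how many lines minimization removed — the same information diff.patch records:

```python
import difflib

def removal_stats(baseline_text, minimized_text):
    # Count lines removed between the baseline and minimized prompts.
    diff = difflib.unified_diff(
        baseline_text.splitlines(),
        minimized_text.splitlines(),
        fromfile="baseline.prompt",
        tofile="minimized.prompt",
        lineterm="",
    )
    removed = sum(
        1 for line in diff
        if line.startswith("-") and not line.startswith("---")
    )
    return removed, len(baseline_text.splitlines())
```

For a four-line baseline reduced to one surviving line, this reports three removed lines out of four.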

Options

Flag               Description
--strategy         ddmin (default) or greedy
--granularity      sections, blocks, sentences, or lines
--stability-mode   strict or kofn, for flaky tests
--confirm-final    Re-verify the final minimized prompt
--no-trace-output  Disable trace logging (for sensitive prompts)
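For comparison with ddmin, a greedy strategy can be sketched as a single pass that permanently drops each chunk whose removal keeps the failure reproducible (again an illustration, not promptmin's implementation):

```python
def greedy_min(chunks, fails):
    # One pass over the chunks: drop any chunk that the failure
    # does not depend on, keep the ones it does.
    i = 0
    while i < len(chunks):
        candidate = chunks[:i] + chunks[i + 1:]
        if fails(candidate):
            chunks = candidate   # chunk was irrelevant to the failure
        else:
            i += 1               # chunk is needed; keep it and move on
    return chunks
```

Greedy costs roughly one test run per chunk, which is fine for short prompts; ddmin's subset/complement search can converge in far fewer runs when large contiguous regions of the prompt are irrelevant.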

License

MIT
