Production-grade prompt optimization middleware for SDK, CLI, and API use cases.
Project description
Prompt Engine
Prompt Engine is a production-ready prompt optimization middleware that takes raw prompts, decomposes intent, applies skills, assembles a higher-signal prompt, and optionally sends it to an LLM provider.
It is designed to work as:
- an MCP server for Cursor, Claude Code, and other agents
- a Python SDK
- a FastAPI service
- a CLI for developers and AI agents
- a pluggable prompt transformation platform
Core capabilities
- Intent detection with rule-based classification and an LLM-classifier extension point
- Prompt decomposition into goal, constraints, expected output, and missing information
- Skill-driven prompt enrichment and conflict-aware ordering
- Structured output enforcement and hallucination guards
- Provider-agnostic execution with retry, timeout, fallback, and stream interfaces
- MCP tools and prompts over stdio and Streamable HTTP
- Prompt diffs, heuristic optimization scoring, tracing, and Prometheus metrics
- Eval harness that runs in CI when skills or prompt logic change
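To make the execution semantics concrete, here is a minimal sketch of retry-with-fallback across providers. The names (`MockProvider`, `run_with_fallback`, `complete`) are illustrative, not the actual `prompt_engine.providers` API; the real adapters also handle timeouts and streaming.

```python
import time

class ProviderError(Exception):
    """Raised when a provider call fails after all retries."""

class MockProvider:
    """Stand-in provider; the real adapters live in prompt_engine.providers."""
    def __init__(self, name, fail=False):
        self.name = name
        self.fail = fail

    def complete(self, prompt: str) -> str:
        if self.fail:
            raise ProviderError(f"{self.name} unavailable")
        return f"[{self.name}] {prompt[:40]}"

def run_with_fallback(prompt, providers, retries=2, backoff=0.01):
    """Try each provider in order, retrying transient failures with exponential backoff."""
    for provider in providers:
        for attempt in range(retries):
            try:
                return provider.complete(prompt)
            except ProviderError:
                time.sleep(backoff * (2 ** attempt))
    raise ProviderError("all providers exhausted")

result = run_with_fallback(
    "Summarize this incident report.",
    [MockProvider("openai", fail=True), MockProvider("mock")],
)
```

A failing primary provider is retried, then the pipeline falls through to the next provider in the list rather than surfacing the error to the caller.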
Project structure
backend/prompt_engine/
api/ FastAPI application, schemas, rate limiting
core/ prompt decomposition and compilation
engine/ intent detection and orchestration
evals/ evaluation runner and scoring logic
mcp/ MCP server, tools, prompts, and HTTP app
observability/ tracing and metrics
pipeline/ skill execution pipeline
providers/ mock, OpenAI, Anthropic, OpenRouter adapters
skills/ skill contracts, registry, built-in skills
config/
config.yaml default runtime configuration
evals/
test_cases.json sample evaluation corpus
runner.py top-level evaluation entrypoint
scoring.py scoring entrypoint
sdk/
python/ usage examples for the Python SDK
typescript/ API client SDK
examples/
cursor/ Cursor MCP config examples
claude-code/ Claude Code MCP setup notes
skills/
README.md external plugin authoring guide
tests/
unit and API smoke tests
extensions/
vscode-prompt-engine/ VS Code and Cursor extension
chrome-prompt-engine/ Chrome extension for browser prompt rewriting
Quick start
python3 -m venv .venv
source .venv/bin/activate
pip install -e ".[dev]"
vibe-prompt-engine run "Extract the customer objections from this transcript and return JSON."
uvicorn prompt_engine.api.app:app --reload
vibe-prompt-engine mcp cursor-config --transport stdio
python evals/runner.py
Extensions
Prompt Engine now ships with first-party extension clients:
- VS Code/Cursor extension in extensions/vscode-prompt-engine
- Chrome extension in extensions/chrome-prompt-engine
VS Code or Cursor local test:
cd extensions/vscode-prompt-engine
npm install
npm run build
npm run package
Then install vibe-prompt-engine-vscode-0.1.5.vsix via the command palette:
- VS Code: Extensions: Install from VSIX...
- Cursor: Extensions: Install from VSIX...
Chrome local test:
- Open chrome://extensions
- Enable Developer Mode
- Click Load unpacked
- Select extensions/chrome-prompt-engine
The Chrome extension now injects a persistent inline toolbar beside ChatGPT-style prompt fields, with Optimize and Tune controls for prompt style settings, and rewrites the current prompt in place without opening the popup.
Both clients default to the hosted production endpoint:
https://prompt-engine-mcp-pewnieev4a-el.a.run.app
MCP quick start
Local stdio server for Cursor:
uvx vibe-prompt-engine mcp stdio
Local or hosted Streamable HTTP server:
vibe-prompt-engine mcp http --host 127.0.0.1 --port 8001 --path /mcp
The FastAPI app also mounts MCP at /mcp, so this works too:
uvicorn prompt_engine.api.app:app --reload --host 0.0.0.0 --port 8000
Then point Cursor to:
command: vibe-prompt-engine
args: ["mcp", "stdio"]
Or point Cursor to:
url: http://127.0.0.1:8000/mcp
You can print ready-to-paste Cursor config:
vibe-prompt-engine mcp cursor-config --transport stdio
vibe-prompt-engine mcp cursor-config --transport http --url http://127.0.0.1:8000/mcp
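Cursor reads MCP servers from `.cursor/mcp.json`; a stdio entry for Prompt Engine looks roughly like this (the server key `prompt-engine` is an arbitrary label, and the exact output of `cursor-config` may differ):

```json
{
  "mcpServers": {
    "prompt-engine": {
      "command": "vibe-prompt-engine",
      "args": ["mcp", "stdio"]
    }
  }
}
```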
Public release strategy
For real public adoption, ship Prompt Engine in two channels:
- PyPI for local stdio MCP installs via uvx or pipx
- Hosted Streamable HTTP MCP at https://api.your-domain.com/mcp
Recommended rollout:
- Publish Python package to PyPI
- Publish container image to GHCR
- Deploy hosted image to Cloud Run
- Give users either:
  - local config using uvx vibe-prompt-engine mcp stdio
  - hosted config using https://api.your-domain.com/mcp
Until PyPI trusted publishing is configured, users can install directly from GitHub:
uvx --from git+https://github.com/adaline-ankit/prompt-engine vibe-prompt-engine mcp stdio
Or with pipx:
pipx install git+https://github.com/adaline-ankit/prompt-engine.git
vibe-prompt-engine mcp stdio
Release automation and deployment notes are in RELEASE.md and deploy/cloudrun/README.md.
Repository
- GitHub: adaline-ankit/prompt-engine
- Package: vibe-prompt-engine on PyPI
CLI examples
vibe-prompt-engine run "Refactor this Python function to reduce branching."
vibe-prompt-engine run "Classify each support ticket by severity and return JSON." --json
vibe-prompt-engine run "Summarize this design doc" --run-llm --provider mock --stream
vibe-prompt-engine mcp http --host 127.0.0.1 --port 8001
API examples
Optimize:
curl -X POST http://localhost:8000/optimize \
-H "Content-Type: application/json" \
-d '{
"prompt": "Extract the account owner, ARR, and renewal date from this note.",
"context": {
"output_schema": {
"type": "object",
"properties": {
"account_owner": {"type": "string"},
"arr": {"type": "number"},
"renewal_date": {"type": "string"}
},
"required": ["account_owner", "arr", "renewal_date"]
}
}
}'
Run against a provider:
curl -X POST http://localhost:8000/run \
-H "Content-Type: application/json" \
-d '{
"prompt": "Summarize this incident report in bullet points.",
"provider": "mock",
"model": "mock-gpt"
}'
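The same request bodies can be assembled from Python with the standard library. This is a sketch against the endpoints shown above; `build_optimize_payload` is a hypothetical helper, not part of the SDK.

```python
import json
import urllib.request

def build_optimize_payload(prompt, output_schema=None):
    """Assemble the JSON body expected by POST /optimize (fields as in the curl example)."""
    payload = {"prompt": prompt}
    if output_schema is not None:
        payload["context"] = {"output_schema": output_schema}
    return payload

payload = build_optimize_payload(
    "Extract the account owner, ARR, and renewal date from this note.",
    output_schema={
        "type": "object",
        "required": ["account_owner", "arr", "renewal_date"],
    },
)
body = json.dumps(payload).encode()

# With a local server running (uvicorn prompt_engine.api.app:app):
# req = urllib.request.Request(
#     "http://localhost:8000/optimize",
#     data=body,
#     headers={"Content-Type": "application/json"},
# )
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp))
```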
MCP over HTTP is available at:
http://localhost:8000/mcp
SDK examples
Python:
from prompt_engine import PromptOptimizationEngine
engine = PromptOptimizationEngine()
result = engine.optimize_prompt(
"Extract product names, pricing, and discount terms from this email.",
context={"output_schema": {"type": "object"}},
)
print(result.final_prompt)
TypeScript:
import { PromptEngineClient } from "./sdk/typescript/src";
const client = new PromptEngineClient({ baseUrl: "http://localhost:8000" });
const result = await client.optimize({
prompt: "Classify these support tickets by severity.",
});
console.log(result.skills_applied);
Providers
- mock works locally with no API key and is used in tests and examples.
- openai, anthropic, and openrouter are implemented behind a common provider contract.
- Set OPENAI_API_KEY, ANTHROPIC_API_KEY, or OPENROUTER_API_KEY to enable remote execution.
Plugins
Prompt Engine loads built-in skills and can load external Python skill modules from paths listed in config/config.yaml.
See skills/README.md for the contract.
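As a rough illustration of what an external skill module might contain, here is a sketch; the class and method names (`PromptDraft`, `applies`, `apply`) are assumptions, and the authoritative contract is the one documented in skills/README.md.

```python
from dataclasses import dataclass

# Hypothetical shapes for illustration only; see skills/README.md for the real contract.
@dataclass
class PromptDraft:
    text: str
    notes: list

class JSONOutputSkill:
    """Example external skill: appends a structured-output guard to the prompt."""
    name = "json_output_guard"

    def applies(self, draft: PromptDraft) -> bool:
        # Only fire when the prompt asks for JSON output.
        return "json" in draft.text.lower()

    def apply(self, draft: PromptDraft) -> PromptDraft:
        draft.notes.append(self.name)
        return PromptDraft(
            text=draft.text + "\nReturn only valid JSON, with no prose around it.",
            notes=draft.notes,
        )

draft = PromptDraft("Classify each ticket by severity and return JSON.", [])
skill = JSONOutputSkill()
if skill.applies(draft):
    draft = skill.apply(draft)
```

A registry of such skills, loaded from the paths in config/config.yaml, is what the pipeline iterates over when enriching a prompt.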
MCP tools
Prompt Engine exposes these MCP tools:
- optimize_prompt
- optimize_and_run
- explain_transformations
- suggest_output_schema
- list_skills
It also exposes MCP prompts:
- optimize_task
- extract_json
Deployment
- Local development via CLI or uvicorn
- Containerized deployment via Dockerfile and docker-compose.yml
- Cloud-ready for ECS, EKS, Cloud Run, Railway, or any ASGI-compatible platform
Project details
Download files
Source Distribution
Built Distribution
File details
Details for the file vibe_prompt_engine-0.1.5.tar.gz.
File metadata
- Download URL: vibe_prompt_engine-0.1.5.tar.gz
- Upload date:
- Size: 41.3 kB
- Tags: Source
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.7
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 040d77958b0e46235e3d3567df9f8912eefb811d5a63a4253e34ca9ccd589cdf |
| MD5 | 1e9c6ff81e5dc9bcbb72f7662bc90ec0 |
| BLAKE2b-256 | 45f572e3d185415caf41089f9422760a78553ffc5a6ce6d55ddacfeb252f75d6 |
Provenance
The following attestation bundles were made for vibe_prompt_engine-0.1.5.tar.gz:
Publisher: release.yml on adaline-ankit/prompt-engine
Statement:
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: vibe_prompt_engine-0.1.5.tar.gz
- Subject digest: 040d77958b0e46235e3d3567df9f8912eefb811d5a63a4253e34ca9ccd589cdf
- Sigstore transparency entry: 1146793225
- Sigstore integration time:
Source repository:
- Permalink: adaline-ankit/prompt-engine@96049b5321373a6a0f902d0fb0d43bf229eb1245
- Branch / Tag: refs/tags/v0.1.5
- Owner: https://github.com/adaline-ankit
- Access: public
Publication detail:
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: release.yml@96049b5321373a6a0f902d0fb0d43bf229eb1245
- Trigger Event: push
File details
Details for the file vibe_prompt_engine-0.1.5-py3-none-any.whl.
File metadata
- Download URL: vibe_prompt_engine-0.1.5-py3-none-any.whl
- Upload date:
- Size: 48.4 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.7
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | cdcc09192f684d780d656b19627a9d6a7b472bf37e0ab853a7b34497a47062bf |
| MD5 | a2fed2689c1c760e2f3447c5ae266f4c |
| BLAKE2b-256 | 5e24525d4ec8356544e7a84946a96e27be9aec4e8e067911f90a7b0a09170f4f |
Provenance
The following attestation bundles were made for vibe_prompt_engine-0.1.5-py3-none-any.whl:
Publisher: release.yml on adaline-ankit/prompt-engine
Statement:
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: vibe_prompt_engine-0.1.5-py3-none-any.whl
- Subject digest: cdcc09192f684d780d656b19627a9d6a7b472bf37e0ab853a7b34497a47062bf
- Sigstore transparency entry: 1146793277
- Sigstore integration time:
Source repository:
- Permalink: adaline-ankit/prompt-engine@96049b5321373a6a0f902d0fb0d43bf229eb1245
- Branch / Tag: refs/tags/v0.1.5
- Owner: https://github.com/adaline-ankit
- Access: public
Publication detail:
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: release.yml@96049b5321373a6a0f902d0fb0d43bf229eb1245
- Trigger Event: push