# PromptFoundry

Runtime prompt control and versioning for production apps. Change a prompt in the UI and see it reflected in your running app within seconds — no redeployment needed.
## Install

```bash
pip install promptfoundry
```
## Quick Start

```python
from fastapi import FastAPI
from promptfoundry import PromptManager, aget_prompt_with_meta, log_prompt_usage

# 1. Initialize once at startup
manager = PromptManager(
    db_path="prompts.db",
    cache_ttl=5,
)

app = FastAPI()

# 2. Mount the UI
app.mount("/prompts", manager.mount_ui())

# 3. Use prompts in your routes
@app.get("/run")
async def run(text: str = "hello"):
    meta = await aget_prompt_with_meta("my_prompt")
    output = your_llm(meta["content"], text)  # your LLM call
    # 4. Log usage with the correct version — no hardcoding
    log_prompt_usage("my_prompt", meta["version_id"], input_text=text, output_text=output)
    return {"output": output}
```

Visit `http://localhost:8000/prompts/list` to manage prompts.
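The Quick Start assumes a `your_llm` helper that is not part of PromptFoundry. A minimal, purely illustrative stand-in (swap the body for your real LLM client):

```python
# Hypothetical stand-in for the `your_llm` call used in Quick Start.
# Not part of promptfoundry -- replace the body with a real LLM client
# call, e.g. sending `prompt` as the system message and `user_input`
# as the user message.
def your_llm(prompt: str, user_input: str) -> str:
    # Echo both pieces so the end-to-end wiring is visible.
    return f"[{prompt}] {user_input}"
```
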
## API Reference
### `aget_prompt(name) -> str`

Returns the active version's content for the named prompt. Use inside async functions (e.g. FastAPI routes).

```python
from promptfoundry import aget_prompt

prompt = await aget_prompt("my_prompt")
```
### `aget_prompt_with_meta(name) -> dict`

Returns both the content and the `version_id` of the active prompt. Preferred when you need to log usage accurately — no hardcoded IDs.

```python
from promptfoundry import aget_prompt_with_meta

meta = await aget_prompt_with_meta("my_prompt")
# meta = {"content": "...", "version_id": 3}
output = your_llm(meta["content"], user_input)
log_prompt_usage("my_prompt", meta["version_id"], input_text=user_input, output_text=output)
```
### `get_prompt(name) -> str`

Synchronous variant. Works in plain scripts outside an event loop. Inside FastAPI routes, use `aget_prompt` or `aget_prompt_with_meta` instead.

```python
from promptfoundry import get_prompt

prompt = get_prompt("my_prompt")
```
### `log_prompt_usage(name, version_id, input_text, output_text)`

Writes a usage log entry to the `prompt_logs` table. Fire-and-forget and non-blocking — safe to call from both sync and async code. Failures are silently swallowed and never propagate to the caller.

```python
from promptfoundry import log_prompt_usage

log_prompt_usage("my_prompt", meta["version_id"], input_text=text, output_text=output)
```

Logs are viewable in the UI at `/prompts/logs`, filterable by prompt name.
## UI Pages

Once mounted, the UI is available at your mount prefix (e.g. `/prompts`):

| Route | Description |
|---|---|
| `/prompts/list` | All prompts with active version, last editor, last updated |
| `/prompts/detail/{name}` | Version history, make active, rollback |
| `/prompts/edit/{name}` | Edit prompt, create new version, AI suggestion |
| `/prompts/edit/__new__` | Create a new prompt |
| `/prompts/diff/{name}` | Line-by-line diff between any two versions |
| `/prompts/test/{name}` | A/B test two versions side by side |
| `/prompts/logs` | Usage log viewer, filterable by prompt name |
## LLM Suggestions

Add LLM config to `PromptManager` and the "✨ Get AI Suggestion" button appears automatically on every edit page. Works with any OpenAI-compatible endpoint.

```python
manager = PromptManager(
    llm_url="https://api.openai.com/v1/chat/completions",
    llm_api_key="sk-...",
    llm_model="gpt-4o",
)
```

To override the system prompt used for suggestions:

```python
manager = PromptManager(
    llm_url="...",
    llm_api_key="...",
    llm_suggester_prompt="You are an expert at writing concise RAG system prompts. Return only the improved prompt.",
)
```

If `llm_url` is not set, the suggestion button and A/B test panel are hidden automatically.
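Because any OpenAI-compatible endpoint works, the suggester can point at a locally hosted model. A sketch assuming an Ollama server on its default port (the URL, model name, and placeholder key are our assumptions; adjust to your setup):

```python
from promptfoundry import PromptManager

# Assumption: a local OpenAI-compatible server (here, Ollama) is running
# at its default address. Local servers typically ignore the API key,
# so a placeholder is passed.
manager = PromptManager(
    llm_url="http://localhost:11434/v1/chat/completions",
    llm_api_key="ollama",       # placeholder; not validated by local servers
    llm_model="llama3.1",       # any model pulled into your Ollama instance
)
```
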
## Protected Mode

Require a password to set active versions or assign the `prod` tag.

```python
manager = PromptManager(
    protected_mode=True,
    admin_password="your-password",
)
```

No sessions or tokens — a simple password check per action. A wrong password re-renders the form with an error.
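To keep the password out of source control, one common pattern is to read it from the environment. This is our own convention, not something PromptFoundry reads itself (the variable name is hypothetical):

```python
import os

from promptfoundry import PromptManager

manager = PromptManager(
    protected_mode=True,
    # Assumed convention: the env var name is our own choice; PromptFoundry
    # does not read the environment -- we pass the value in explicitly.
    admin_password=os.environ["PROMPTFOUNDRY_ADMIN_PASSWORD"],
)
```
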
## Version Tagging

Versions can be tagged `prod`, `staging`, or `experiment`.

- Only one `prod` tag is active per prompt at a time — assigning it removes the tag from the previous version automatically.
- In protected mode, assigning `prod` or setting a version active requires the admin password.
## Resilience

If the database is unreachable, `get_prompt` / `aget_prompt` serve the last cached value and log a warning, so your app never crashes due to a DB failure. If there is no cached value and the DB is down, a `PromptNotFoundError` is raised.
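The serve-stale behavior can be pictured with a small self-contained sketch. This is our own illustration of the mechanism described above, not the library's actual internals:

```python
import time


class StaleOnErrorCache:
    """Illustrative model of the fallback described above: refresh from
    the DB every `ttl` seconds, but keep serving the last good value if
    the refresh fails."""

    def __init__(self, fetch, ttl=5):
        self.fetch = fetch      # callable(name) -> str; may raise on DB failure
        self.ttl = ttl
        self.cached = {}        # name -> (value, expiry timestamp)

    def get(self, name):
        value, expiry = self.cached.get(name, (None, 0.0))
        now = time.monotonic()
        if value is not None and now < expiry:
            return value        # fresh cache hit
        try:
            value = self.fetch(name)
            self.cached[name] = (value, now + self.ttl)
            return value
        except Exception:
            if value is None:
                # No cached copy and the DB is down: nothing to serve.
                raise LookupError(name)
            return value        # serve stale rather than crash
```
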
## Config Reference

| Parameter | Type | Default | Description |
|---|---|---|---|
| `db_path` | `str` | `"prompts.db"` | SQLite file path (auto-created) |
| `cache_ttl` | `int` | `5` | Seconds between cache refreshes |
| `protected_mode` | `bool` | `False` | Require password for prod actions |
| `admin_password` | `str` | `None` | Required if `protected_mode=True` |
| `log_sample_rate` | `float` | `1.0` | Fraction of usages to log (0.0–1.0) |
| `llm_url` | `str` | `None` | OpenAI-compatible chat completions endpoint |
| `llm_api_key` | `str` | `None` | Bearer token for LLM API |
| `llm_model` | `str` | `"gpt-3.5-turbo"` | Model name |
| `llm_suggester_prompt` | `str` | built-in | System prompt used by the AI suggester |
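A `log_sample_rate` below 1.0 implies per-call sampling. A plausible model of that gate (our assumption about the mechanism, not confirmed internals):

```python
import random


def should_log(sample_rate: float) -> bool:
    # Independent coin flip per usage: 1.0 logs everything, 0.0 logs
    # nothing, 0.1 logs roughly 10% of calls.
    return random.random() < sample_rate
```
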
## Download files
### Source distribution: promptfoundry-0.1.1.tar.gz

- Size: 20.8 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.11.0

| Algorithm | Hash digest |
|---|---|
| SHA256 | `cd779983e6164ec6d89b91afa6f06b49954fb7669c45239b14cae7f6587592a5` |
| MD5 | `d1f31f29e2d986ac755b662c4a65d90d` |
| BLAKE2b-256 | `d82e5bbb1c84818cd23d7b571eb427871a6da25db9d1c6249b38ebb76389642c` |
### Built distribution: promptfoundry-0.1.1-py3-none-any.whl

- Size: 21.6 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.11.0

| Algorithm | Hash digest |
|---|---|
| SHA256 | `4e57be7599c2050e0bc4c49ef47d37cd3532e4bb869176b100d36b045d74b89b` |
| MD5 | `aadb811d6c45596845374b4516f4cfee` |
| BLAKE2b-256 | `b1d661f5aa47cbbfe882633e1e799256a6b2470cd2a3a44ede11f1316b2bf4d1` |