
philiprehberger-prompt-builder


Type-safe prompt template builder for LLM APIs.

Installation

pip install philiprehberger-prompt-builder

Usage

from philiprehberger_prompt_builder import Prompt

messages = (
    Prompt()
    .system("You are a helpful assistant.")
    .user("Summarize this article: {article}")
    .render(article="Long article text here...")
)
# [{"role": "system", "content": "You are a helpful assistant."},
#  {"role": "user", "content": "Summarize this article: Long article text here..."}]
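The library's internals are not shown here, but the behaviour above is equivalent to `str.format`-style substitution over a list of message dicts. A minimal sketch of that equivalent (an assumption about the mechanics, not the library's actual implementation):

```python
# Sketch of render()-style substitution, assuming str.format semantics.
template = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Summarize this article: {article}"},
]

def render(template, **variables):
    # Produce a new list; the original template is left untouched.
    return [
        {"role": m["role"], "content": m["content"].format(**variables)}
        for m in template
    ]

messages = render(template, article="Long article text here...")
```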

Few-Shot Examples

from philiprehberger_prompt_builder import Prompt

messages = (
    Prompt()
    .system("Classify the sentiment.")
    .example(user="I love this!", assistant="positive")
    .example(user="Terrible product.", assistant="negative")
    .user("{text}")
    .render(text="Pretty good actually")
)

Batch Few-Shot Examples

from philiprehberger_prompt_builder import Prompt

messages = (
    Prompt()
    .system("Translate English to French.")
    .with_examples([
        ("Hello", "Bonjour"),
        ("Goodbye", "Au revoir"),
        ("Thank you", "Merci"),
    ])
    .user("{text}")
    .render(text="Good morning")
)
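One plausible reading of `with_examples()` is that each `(input, output)` tuple expands into a user/assistant message pair, matching repeated `.example()` calls. A hand-rolled equivalent of that expansion (assumed, not confirmed by the library's source):

```python
# Expand (input, output) tuples into alternating user/assistant messages.
examples = [
    ("Hello", "Bonjour"),
    ("Goodbye", "Au revoir"),
    ("Thank you", "Merci"),
]

expanded = []
for user_text, assistant_text in examples:
    expanded.append({"role": "user", "content": user_text})
    expanded.append({"role": "assistant", "content": assistant_text})
```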

Output Format Instructions

from philiprehberger_prompt_builder import Prompt

# Request JSON output
messages = (
    Prompt()
    .system("Extract structured data.")
    .user("Parse: {text}")
    .expect_json(description='{"name": string, "age": number}')
    .render(text="John is 30 years old")
)

# Request list output
messages = (
    Prompt()
    .system("Generate ideas.")
    .user("List 5 project ideas about {topic}")
    .expect_list()
    .render(topic="machine learning")
)
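The exact instruction text `expect_json()` appends is not documented here. One plausible implementation appends a format hint to the final user message; the wording below is an assumption for illustration only:

```python
messages = [
    {"role": "system", "content": "Extract structured data."},
    {"role": "user", "content": "Parse: John is 30 years old"},
]
schema = '{"name": string, "age": number}'

# Append a JSON-format instruction to the final user message
# (hypothetical wording; the library's real phrasing may differ).
messages[-1]["content"] += f"\n\nRespond only with valid JSON matching: {schema}"
```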

Conditional Content

from philiprehberger_prompt_builder import Prompt

use_examples = True

messages = (
    Prompt()
    .system("You are a helpful assistant.")
    .conditional(use_examples, "user", "Here are some examples...")
    .conditional(use_examples, "assistant", "I understand the examples.")
    .user("Now answer my question: {question}")
    .render(question="What is Python?")
)

Prompt Composition

from philiprehberger_prompt_builder import Prompt

preamble = Prompt().system("You are a coding assistant.").user("Use Python 3.12+.")
task = Prompt().user("Write a function that {task}")

combined = preamble.merge(task)
messages = combined.render(task="sorts a list")
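Per the API table, `merge()` creates a new `Prompt` combining messages from both operands. Conceptually that is list concatenation followed by a single substitution pass; a sketch of the equivalent operation on plain dicts:

```python
preamble = [
    {"role": "system", "content": "You are a coding assistant."},
    {"role": "user", "content": "Use Python 3.12+."},
]
task = [{"role": "user", "content": "Write a function that {task}"}]

# Concatenate into a new list without mutating either input.
combined = [*preamble, *task]
rendered = [
    {"role": m["role"], "content": m["content"].format(task="sorts a list")}
    for m in combined
]
```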

Reusable Templates

from philiprehberger_prompt_builder import PromptTemplate

summarizer = PromptTemplate(
    system="You are a {tone} summarizer. Output in {format}.",
    user="Summarize: {content}",
    defaults={"tone": "concise", "format": "bullet points"},
)

messages = summarizer.render(content="Article text...")
messages = summarizer.render(content="...", tone="detailed", format="paragraphs")

verbose = summarizer.extend(tone="thorough", format="essay")
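The precedence shown above (call-time arguments beat `defaults`) follows the usual dict-merge pattern, where later keys win. A sketch of that resolution, assuming this is how the template combines its variables:

```python
defaults = {"tone": "concise", "format": "bullet points"}

def resolve(defaults, **overrides):
    # Later keys win, so explicit render() kwargs shadow the defaults.
    return {**defaults, **overrides}

plain = resolve(defaults, content="Article text...")
detailed = resolve(defaults, content="...", tone="detailed", format="paragraphs")
```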

Prompt Versioning

from philiprehberger_prompt_builder import Prompt, PromptVersionStore

store = PromptVersionStore()

v1 = Prompt().system("You are helpful.").user("Answer: {question}")
store.save("v1", v1)

v2 = Prompt().system("You are a concise expert.").user("Answer briefly: {question}")
store.save("v2", v2)

prompt = store.load("v1")
messages = prompt.render(question="What is Python?")

store.list_versions()  # ["v1", "v2"]
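The documented `PromptVersionStore` interface maps cleanly onto a dict keyed by version name. A minimal stand-in with the same method names (an illustrative sketch, not the library's implementation):

```python
class VersionStore:
    """Dict-backed sketch of the documented PromptVersionStore interface."""

    def __init__(self):
        self._versions = {}

    def save(self, name, prompt):
        self._versions[name] = prompt

    def load(self, name):
        return self._versions[name]

    def list_versions(self):
        # Python dicts preserve insertion order, so this matches save order.
        return list(self._versions)

    def delete(self, name):
        del self._versions[name]

store = VersionStore()
store.save("v1", "prompt one")
store.save("v2", "prompt two")
names = store.list_versions()
```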

Token Estimation

from philiprehberger_prompt_builder import Prompt

prompt = Prompt().system("You are helpful.").user("{text}")
estimated = prompt.estimate_tokens(text="Hello world")
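The API table describes `estimate_tokens()` as a word heuristic. A common rule of thumb is roughly four characters per token; the exact formula below is an assumption, sketched to show the kind of estimate involved:

```python
def estimate_tokens(messages):
    # Crude heuristic: ~4 characters per token across all message content.
    text = " ".join(m["content"] for m in messages)
    return max(1, len(text) // 4)

messages = [
    {"role": "system", "content": "You are helpful."},
    {"role": "user", "content": "Hello world"},
]
estimated = estimate_tokens(messages)
```

Heuristics like this are cheap but approximate; for exact counts you would tokenize with the target model's tokenizer.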

Context-Window Warnings

prompt = Prompt().user("very long input...")

warnings = prompt.warn_if_over(limit=8192)
for w in warnings:
    print("WARN:", w)

Returns warning strings (not exceptions) when the estimated token count approaches or exceeds the limit, so callers can decide whether to truncate, summarise, or proceed.
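A sketch of that warning logic, assuming a "within 10% of the limit" threshold (the real threshold and message wording are not documented here):

```python
def warn_if_over(estimated, limit, headroom=0.9):
    # Return strings rather than raising, so the caller decides what to do.
    warnings = []
    if estimated >= limit:
        warnings.append(f"estimated {estimated} tokens exceeds limit {limit}")
    elif estimated >= limit * headroom:
        warnings.append(f"estimated {estimated} tokens is close to limit {limit}")
    return warnings

over = warn_if_over(9000, limit=8192)
near = warn_if_over(7500, limit=8192)
fine = warn_if_over(100, limit=8192)
```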

API

Function / Class Description
Prompt Fluent builder for constructing LLM message lists
.system(content) Add a system message
.user(content) Add a user message
.assistant(content) Add an assistant message
.message(role, content) Add a message with any role
.example(user, assistant) Add a few-shot example pair
.with_examples(examples) Add multiple few-shot examples from a list of (input, output) tuples
.expect_json(description) Append instruction requesting JSON output
.expect_list(description) Append instruction requesting list output
.conditional(include, role, content) Conditionally add a message if include is truthy
.merge(other) Create a new Prompt combining messages from self and other
.render(**kwargs) Render with variable substitution, returns list of dicts
.render_messages(**kwargs) Render and return Message objects
.estimate_tokens(**kwargs) Approximate token count using word heuristics
.warn_if_over(limit, **kwargs) List warnings when estimated tokens approach or exceed limit
PromptTemplate Reusable prompt template with default values
.extend(**overrides) Create a new template with updated defaults
PromptVersionStore Store and retrieve named prompt versions
.save(name, prompt) Save a prompt snapshot under a name
.load(name) Retrieve a stored prompt version by name
.list_versions() List all stored version names
.delete(name) Delete a stored prompt version
Message A single message with role and content

Development

pip install -e .
python -m pytest tests/ -v

Support

If you find this project useful:

Star the repo

🐛 Report issues

💡 Suggest features

❤️ Sponsor development

🌐 All Open Source Projects

💻 GitHub Profile

🔗 LinkedIn Profile

License

MIT
