philiprehberger-prompt-builder

Type-safe prompt template builder for LLM APIs.

Installation

pip install philiprehberger-prompt-builder

Usage

Fluent Builder

from philiprehberger_prompt_builder import Prompt

messages = (
    Prompt()
    .system("You are a helpful assistant.")
    .user("Summarize this article: {article}")
    .render(article="Long article text here...")
)
# [{"role": "system", "content": "You are a helpful assistant."},
#  {"role": "user", "content": "Summarize this article: Long article text here..."}]

Few-Shot Examples

messages = (
    Prompt()
    .system("Classify the sentiment.")
    .example(user="I love this!", assistant="positive")
    .example(user="Terrible product.", assistant="negative")
    .user("{text}")
    .render(text="Pretty good actually")
)

Conditional Content

use_examples = True

messages = (
    Prompt()
    .system("You are a helpful assistant.")
    .conditional(use_examples, "user", "Here are some examples...")
    .conditional(use_examples, "assistant", "I understand the examples.")
    .user("Now answer my question: {question}")
    .render(question="What is Python?")
)

Prompt Composition

# Build reusable prompt fragments
preamble = Prompt().system("You are a coding assistant.").user("Use Python 3.12+.")
task = Prompt().user("Write a function that {task}")

# Merge fragments into a single prompt
combined = preamble.merge(task)
messages = combined.render(task="sorts a list")

Reusable Templates

from philiprehberger_prompt_builder import PromptTemplate

summarizer = PromptTemplate(
    system="You are a {tone} summarizer. Output in {format}.",
    user="Summarize: {content}",
    defaults={"tone": "concise", "format": "bullet points"},
)

# Use with defaults
messages = summarizer.render(content="Article text...")

# Override defaults
messages = summarizer.render(content="...", tone="detailed", format="paragraphs")

# Create variant
verbose = summarizer.extend(tone="thorough", format="essay")
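
The defaults-merging behavior above can be pictured in plain Python: render overlays call-time arguments on the stored defaults, and extend returns a copy with updated defaults. A minimal sketch for illustration only (MiniTemplate is a hypothetical stand-in, not the library's implementation):

```python
class MiniTemplate:
    """Illustrative sketch of a template with default variable values."""

    def __init__(self, system, user, defaults=None):
        self.system = system
        self.user = user
        self.defaults = dict(defaults or {})

    def render(self, **kwargs):
        # Call-time arguments take precedence over stored defaults.
        variables = {**self.defaults, **kwargs}
        return [
            {"role": "system", "content": self.system.format(**variables)},
            {"role": "user", "content": self.user.format(**variables)},
        ]

    def extend(self, **new_defaults):
        # Return a new template; the original's defaults are untouched.
        return MiniTemplate(self.system, self.user,
                            {**self.defaults, **new_defaults})
```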

Token Estimation

prompt = Prompt().system("...").user("{text}")
estimated = prompt.estimate_tokens(text="Hello world")
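
The API reference below describes the estimate as roughly 4 characters per token. As a plain-Python sketch of that rule of thumb (an illustration, not the library's actual implementation):

```python
# Rough token estimate: total rendered characters divided by 4.
def rough_token_estimate(*contents: str) -> int:
    total_chars = sum(len(c) for c in contents)
    return max(1, total_chars // 4)

print(rough_token_estimate("You are a helpful assistant.", "Hello world"))  # → 9
```

Character-based estimates like this are cheap but approximate; for exact counts you would tokenize with the target model's tokenizer.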

API

Prompt: Fluent builder for constructing LLM message lists
  .system(content): Add a system message
  .user(content): Add a user message
  .assistant(content): Add an assistant message
  .message(role, content): Add a message with any role
  .example(user, assistant): Add a few-shot example pair
  .conditional(include, role, content): Conditionally add a message if include is truthy
  .merge(other): Create a new Prompt combining messages from self and other
  .render(**kwargs): Render with variable substitution; returns a list of dicts
  .render_messages(**kwargs): Render and return Message objects
  .estimate_tokens(**kwargs): Rough token count (~4 chars/token)
PromptTemplate: Reusable prompt template with default values
Message: A single message with role and content
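
The methods above follow a standard fluent-builder pattern: each call appends a message and returns the builder, and render applies per-message placeholder substitution. A minimal self-contained sketch of that pattern (MiniPrompt is a hypothetical illustration, not the library's code; render_messages and the Message class are omitted):

```python
class MiniPrompt:
    """Minimal fluent builder mirroring the documented API (sketch only)."""

    def __init__(self, messages=None):
        self._messages = list(messages or [])

    def message(self, role, content):
        self._messages.append({"role": role, "content": content})
        return self  # returning self is what enables method chaining

    def system(self, content):
        return self.message("system", content)

    def user(self, content):
        return self.message("user", content)

    def assistant(self, content):
        return self.message("assistant", content)

    def example(self, user, assistant):
        # A few-shot example is just a user/assistant pair.
        return self.user(user).assistant(assistant)

    def conditional(self, include, role, content):
        return self.message(role, content) if include else self

    def merge(self, other):
        # New object; neither operand is mutated.
        return MiniPrompt(self._messages + other._messages)

    def render(self, **kwargs):
        # Substitute {placeholders} in each message's content.
        return [{"role": m["role"], "content": m["content"].format(**kwargs)}
                for m in self._messages]
```

Returning the builder from every mutating method is the design choice that makes chains like `Prompt().system(...).user(...).render(...)` read as a single expression.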

Development

pip install -e .
python -m pytest tests/ -v

Support

If you find this project useful:

⭐ Star the repo

🐛 Report issues

💡 Suggest features

❤️ Sponsor development

🌐 All Open Source Projects

💻 GitHub Profile

🔗 LinkedIn Profile

License

MIT
