A package for loading promptdown files, which are a special type of markdown file for defining structured LLM prompts

Project description

Promptdown

Supports Python versions 3.10+.

Promptdown is a Python package that allows you to express structured prompts for language models in a markdown format. It provides a simple and intuitive way to define and manage prompts, making it easier to work with language models in your projects.

Installation

Using PDM

Promptdown can be installed using PDM:

pdm add promptdown

Using pip

Alternatively, you can install Promptdown using pip:

pip install promptdown

Usage

Basic Usage

To use Promptdown, create a Promptdown file (.prompt.md) in the following format. Use either a System Message or a Developer Message (for newer model APIs), but not both:

# My Prompt

## System Message

You are a helpful assistant.

## Conversation

**User:**
Hi, can you help me?

**Assistant:**
Of course! What do you need assistance with?

**User:**
I'm having trouble with my code.

**Assistant:**
I'd be happy to help. What seems to be the problem?

Or alternatively:

# My Prompt

## Developer Message

You are a helpful assistant.

## Conversation

**User:**
Hi, can you help me?

**Assistant:**
Of course! What do you need assistance with?

**User:**
I'm having trouble with my code.

**Assistant:**
I'd be happy to help. What seems to be the problem?

Then, you can parse this file into a StructuredPrompt object using Promptdown:

from promptdown import StructuredPrompt

structured_prompt = StructuredPrompt.from_promptdown_file('path/to/your_prompt_file.prompt.md')
print(structured_prompt)

Please note that:

  • The Conversation section can be omitted
  • Either a System Message or Developer Message section is required, but not both
  • Use Developer Message for newer model APIs (like OpenAI's o1) that expect the "developer" role instead of "system"
  • Conversations use a simplified format: roles are marked in bold text (**User:** or **Assistant:**, optionally **Role (Name):** to include a name)
  • Only the User and Assistant roles are recognized; other roles are ignored with a warning, and conversation roles cannot be system or developer
  • Conversation message lines are collapsed into a single line separated by spaces (blank lines are dropped)
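The role-marker and line-collapsing rules above can be sketched with a small standalone parser. This is an illustration of the rules as described, not Promptdown's actual implementation:

```python
import re

def parse_conversation(text: str) -> list[dict[str, str]]:
    """Illustrative parser for the conversation rules described above."""
    messages = []
    current = None
    for line in text.splitlines():
        line = line.strip()
        # A bold marker like **User:** or **Assistant (Ada):** starts a new message.
        m = re.match(r"\*\*(User|Assistant)(?:\s*\(([^)]*)\))?:\*\*$", line)
        if m:
            current = {"role": m.group(1).lower(), "content": []}
            if m.group(2):
                current["name"] = m.group(2)
            messages.append(current)
        elif line and current is not None:
            # Non-blank lines are collapsed into one line joined by spaces.
            current["content"].append(line)
    return [{**msg, "content": " ".join(msg["content"])} for msg in messages]

conversation = """\
**User:**
Hi, can you
help me?

**Assistant (Ada):**
Of course!
"""
print(parse_conversation(conversation))
```

Running this collapses each turn's lines into a single message and attaches the optional name, matching the behavior described above.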

Parsing a Prompt from a String

For scenarios where you have the prompt data as a string (perhaps dynamically generated or retrieved from an external source), you can parse it directly:

from promptdown import StructuredPrompt

promptdown_string = """
# My Prompt

## Developer Message

You are a helpful assistant.

## Conversation

**User:**
Hi, can you help me?

**Assistant:**
Of course! What do you need assistance with?

**User:**
I'm having trouble with my code.

**Assistant:**
I'd be happy to help. What seems to be the problem?
"""

structured_prompt = StructuredPrompt.from_promptdown_string(promptdown_string)
print(structured_prompt)

Converting to Chat Completion Messages

The to_chat_completion_messages method converts a StructuredPrompt instance into a list of dictionaries suitable for chat completion API clients. The returned list includes the system or developer message first, followed by the conversation messages. This is useful when you need to send the structured conversation to an API that expects messages in a specific format. Here's an example of how to use this method:

from promptdown import StructuredPrompt

promptdown_string = """
# My Prompt

## Developer Message

You are a helpful assistant.

## Conversation

**User:**
Hi, can you help me?

**Assistant:**
Of course! What do you need assistance with?

**User:**
I'm having trouble with my code.
"""

structured_prompt = StructuredPrompt.from_promptdown_string(promptdown_string)
messages_from_promptdown = structured_prompt.to_chat_completion_messages()

# Example with the OpenAI SDK (Chat Completions API)
from openai import OpenAI
client = OpenAI()

response = client.chat.completions.create(
    model="gpt-4o",
    messages=messages_from_promptdown,
    temperature=0.7,
    max_tokens=300,
)

Converting to OpenAI Responses input

For the OpenAI Responses API, use to_responses_input() to emit messages in the expected format. Any prior system content is mapped to the developer role by default for consistency with newer models; you can disable this via map_system_to_developer=False.

from promptdown import StructuredPrompt

# Reusing the promptdown_string from the previous example
structured_prompt = StructuredPrompt.from_promptdown_string(promptdown_string)
responses_input = structured_prompt.to_responses_input()

# Example with the OpenAI SDK (Responses API)
from openai import OpenAI
client = OpenAI()

result = client.responses.create(
    model="gpt-5",
    input=responses_input,
    reasoning={"effort": "medium"},
)

Notes:

  • to_responses_input() outputs a list of messages, each { "role": "<role>", "content": [{"type": "input_text", "text": "..."}] }.
  • Content is always non-null; non-strings are coerced with str(...).
  • Already-structured input_text parts are passed through; other shapes are coerced to text.
  • Current scope focuses on text parts; additional types (images/tools) can be added in the future.
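The coercion rules in the notes above can be sketched as a small standalone function. This is an illustration of the described behavior, not Promptdown's source:

```python
def to_responses_part(content) -> list[dict]:
    """Illustrative sketch of the content-coercion rules described above."""
    if isinstance(content, str):
        # Strings become a single input_text part.
        return [{"type": "input_text", "text": content}]
    if isinstance(content, list) and all(
        isinstance(p, dict) and p.get("type") == "input_text" for p in content
    ):
        # Already-structured input_text parts pass through unchanged.
        return content
    # Anything else is coerced to text with str(...), so content is never null.
    return [{"type": "input_text", "text": str(content)}]
```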

If you have legacy Chat Completions-style messages and want to convert them to Responses input, a convenience converter is available:

from promptdown.converters import convert_chat_messages_to_responses_input

legacy_messages = [
    {"role": "system", "content": "You are helpful."},
    {"role": "user", "content": [{"type": "text", "text": "Hello"}]},
]
responses_messages = convert_chat_messages_to_responses_input(legacy_messages)
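The mapping this converter performs, as described above, can be sketched standalone. The function below is a hypothetical reimplementation for illustration, not the actual convert_chat_messages_to_responses_input:

```python
def convert_legacy(messages: list[dict], map_system_to_developer: bool = True) -> list[dict]:
    """Illustrative sketch: map legacy Chat Completions messages to Responses input."""
    out = []
    for msg in messages:
        role = msg["role"]
        # The system role is mapped to developer by default, per the notes above.
        if role == "system" and map_system_to_developer:
            role = "developer"
        content = msg["content"]
        if isinstance(content, str):
            parts = [{"type": "input_text", "text": content}]
        else:
            # Legacy "text" parts become "input_text" parts.
            parts = [{"type": "input_text", "text": p.get("text", "")} for p in content]
        out.append({"role": role, "content": parts})
    return out
```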

Loading Prompts from Package Resources

For applications where prompts are bundled within Python packages, Promptdown can load prompts directly from these resources. This approach is useful for distributing prompts alongside Python libraries or applications:

from promptdown import StructuredPrompt

structured_prompt = StructuredPrompt.from_package_resource('your_package', 'your_prompt_file.prompt.md')
print(structured_prompt)

This method facilitates easy management of prompts within a package, ensuring that they can be versioned, shared, and reused effectively.
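Under the hood, loading from package resources presumably works along the lines of the standard importlib.resources mechanism. The helper below is a hypothetical sketch of that pattern, not Promptdown's implementation:

```python
from importlib import resources

def read_prompt_resource(package: str, filename: str) -> str:
    """Illustrative sketch: read a text resource bundled inside a package."""
    return resources.files(package).joinpath(filename).read_text(encoding="utf-8")
```

The returned string could then be handed to StructuredPrompt.from_promptdown_string. Using importlib.resources (rather than file paths) keeps loading working even when the package is installed as a zip or wheel.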

Using Template Strings

Promptdown supports the use of template strings within your prompts, allowing for dynamic customization of both system messages and conversation content. This feature is particularly useful when you need to tailor prompts based on specific contexts or user data.

Defining Template Strings

To incorporate template strings in your Promptdown files, use curly braces {variable} around placeholders that you intend to replace dynamically. Here is an example of how to use template strings in a prompt:

# My Prompt

## Developer Message

You are a helpful assistant in {topic}.

## Conversation

**User:**
Hi, can you help me with {topic}?

**Assistant:**
Of course! What specifically do you need help with in {topic}?

**User:**
I'm having trouble understanding {concept}.

**Assistant:**
No problem! Let's dive into {concept} together.

Applying Template Values

Once you have defined a prompt with placeholders, you can replace these placeholders by passing a dictionary of template values to the apply_template_values method. Here's how you can apply template values to your prompt:

from promptdown import StructuredPrompt

# Load your structured prompt from a file or string that contains template placeholders
structured_prompt = StructuredPrompt.from_promptdown_string(promptdown_string)

# Define the template values to apply
template_values = {
    "topic": "Python programming",
    "concept": "decorators"
}

# Apply the template values (returns a new StructuredPrompt)
new_prompt = structured_prompt.apply_template_values(template_values)

# Output the updated prompt
print(new_prompt)

This returns a new StructuredPrompt in which {topic} becomes "Python programming" and {concept} becomes "decorators" in both the system/developer message and the conversation content; the original structured_prompt is unchanged. Note that template values are not applied inside triple-backtick code blocks.

Template strings allow for more flexible and context-sensitive interactions with language models.
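The substitution behavior described above (placeholders replaced everywhere except inside triple-backtick code blocks) can be sketched with a small standalone function. This illustrates the described rules, not Promptdown's implementation:

```python
def apply_template(text: str, values: dict[str, str]) -> str:
    """Illustrative sketch: replace {placeholders} outside fenced code blocks."""
    out = []
    in_code = False
    for line in text.splitlines():
        if line.strip().startswith("```"):
            # Toggle code-block state at each fence; fences pass through as-is.
            in_code = not in_code
            out.append(line)
        elif in_code:
            # Placeholders inside code blocks are left untouched.
            out.append(line)
        else:
            for key, value in values.items():
                line = line.replace("{" + key + "}", value)
            out.append(line)
    return "\n".join(out)
```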

Contributing

Contributions are welcome! Feel free to open an issue or submit a pull request.

License

Promptdown is released under the MIT License.

Project details


Download files

Download the file for your platform.

Source Distribution

promptdown-1.1.6.tar.gz (13.1 kB)

Uploaded Source

Built Distribution


promptdown-1.1.6-py3-none-any.whl (12.0 kB)

Uploaded Python 3

File details

Details for the file promptdown-1.1.6.tar.gz.

File metadata

  • Download URL: promptdown-1.1.6.tar.gz
  • Upload date:
  • Size: 13.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for promptdown-1.1.6.tar.gz:

  • SHA256: e7bf42e5068d149600ffcfbdd0bbeed6375dbd6040d4ea8e1c40a6a1c5602c77
  • MD5: 4d1ab6cff959c0c0d99ef28bc14fa273
  • BLAKE2b-256: 29552613a5c9e1ddef7bc9eaf517da80ed74280ccbd6bba63da63cf41ee0da87


Provenance

The following attestation bundles were made for promptdown-1.1.6.tar.gz:

Publisher: python-publish.yml on btfranklin/promptdown

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file promptdown-1.1.6-py3-none-any.whl.

File metadata

  • Download URL: promptdown-1.1.6-py3-none-any.whl
  • Upload date:
  • Size: 12.0 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for promptdown-1.1.6-py3-none-any.whl:

  • SHA256: 897ee566d6e99bbd3f39e4c89ce35466ec1a266c19da1efad69b3500ad1abdfd
  • MD5: a879db6c9eb36f82bad08aa3fe33218b
  • BLAKE2b-256: 1df0f2769f8c513deffca1a6be2e2cf69c1b1e1f21edf05d6cbe59abe224adfe


Provenance

The following attestation bundles were made for promptdown-1.1.6-py3-none-any.whl:

Publisher: python-publish.yml on btfranklin/promptdown

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
