
Dynamic prompt management and configuration library for LLM applications. Powerful, lazy-loading, and supports Jinja2 templates and Pydantic schemas.

Project description




DynaPrompt is a powerful, lazy-loading prompt configuration manager inspired by Dynaconf. It offers a structured way to manage, version, and render LLM prompts while keeping your templates completely separated from your application logic.




🛑 The Problem: Why DynaPrompt?

LLM applications usually start simple, but quickly degenerate into an unmaintainable mess of hardcoded strings, f-strings, and scattered configuration dictionaries.

❌ Without DynaPrompt (The Mess)

import os, json

# Hardcoded, mixed with logic, impossible to swap easily for testing/production
SYSTEM_PROMPT = f"""
You are a helpful assistant.
Current User: {user_name}
Format your output according to this schema:
{json.dumps(MySchema.model_json_schema())}
"""

if os.getenv("ENV") == "production":
    model = "gpt-4"
    temperature = 0.2
else:
    model = "gpt-3.5-turbo"
    temperature = 0.7

response = llm_client.generate(prompt=SYSTEM_PROMPT, model=model, temp=temperature)

✅ With DynaPrompt (Clean & Maintainable)

from dynaprompt import DynaPrompt

# Zero I/O at import. Auto-discovers environments, schemas, and templates.
prompts = DynaPrompt(settings_files=["prompts/"])

# Automatically uses the right model/temp for your current environment!
rendered = prompts.system.render(user_name="Emam")

response = llm_client.generate(
    prompt=rendered.text,
    model=rendered.config["model"],
    temp=rendered.config["temperature"],
    response_format=rendered.response_schema
)

🚀 Installation

# Using pip
pip install dynaprompt

# Using uv (recommended)
uv add dynaprompt

📖 Usage Examples

1. Markdown with YAML Frontmatter (The Cleanest Way)

DynaPrompt allows you to define prompts as standalone Markdown files. You can attach LLM configuration (like model, temperature, or required response_schema) directly at the top of the file using YAML Frontmatter.

prompts/analyzer.md

---
model: gpt-4o
temperature: 0.2
max_tokens: 1000
response_schema: AnalysisSchema
---
You are an expert code analyzer.
Please review the following code snippet from {{ developer_name }}:

```python
{{ code_snippet }}
```

Analyze it and return the result strictly matching the schema.

Usage:

prompts = DynaPrompt(settings_files=["prompts/"])

# Renders the Jinja template with your variables
rendered = prompts.analyzer.render(developer_name="Emam", code_snippet="print('hello')")

print(rendered.config["model"]) # "gpt-4o"
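The AnalysisSchema named in the frontmatter must resolve to a schema class defined in your project. A minimal sketch, assuming dynaprompt looks the name up among your Pydantic models (the field names here are illustrative, not prescribed by the library):

```python
# Hypothetical Pydantic model backing `response_schema: AnalysisSchema`
# in the frontmatter above.
from pydantic import BaseModel

class AnalysisSchema(BaseModel):
    summary: str
    issues: list[str]
    severity: int  # e.g. 1 (minor) .. 5 (critical)
```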

2. Environment Layering (Dev vs Prod)

You can define base settings and then override them for specific environments (e.g., development, production). DynaPrompt automatically switches based on ENV_FOR_DYNAPROMPT.

prompts.toml

# Base settings for all environments
[default.summarizer]
template = "Summarize this: {{ text }}"
model = "gpt-3.5-turbo"
temperature = 0.7

# Overrides for production ONLY
[production.summarizer]
model = "gpt-4-turbo"
temperature = 0.1

Usage:

# Default environment
prompts = DynaPrompt(settings_files=["prompts.toml"], env="development")
print(prompts.summarizer.config["model"])  # "gpt-3.5-turbo"

# Switch to production dynamically
with prompts.using_env("production"):
    print(prompts.summarizer.config["model"])  # "gpt-4-turbo"
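The layering above behaves like a key-by-key dictionary merge: environment sections override only the keys they declare, everything else is inherited from [default]. An illustrative sketch of that semantics (this is not dynaprompt's actual merge code):

```python
# Values from the active environment override [default] key-by-key.
default = {
    "template": "Summarize this: {{ text }}",
    "model": "gpt-3.5-turbo",
    "temperature": 0.7,
}
production = {"model": "gpt-4-turbo", "temperature": 0.1}

effective = {**default, **production}
print(effective["model"])     # "gpt-4-turbo"
print(effective["template"])  # inherited: "Summarize this: {{ text }}"
```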

3. File-Based Templates and Variables

Keep your configuration files pristine. DynaPrompt can automatically resolve templates and variables from external files or Python modules.

[default.customer_service]
# Load text directly from an external markdown file
template = "prompts/customer_service.md"

# Dynamically import a string variable from a Python file!
# (Extracts 'greeting_prompt' from 'config/prompts.py')
fallback_template = "config.prompts.greeting_prompt"

# Merge specific dictionaries or load entire Python modules as global variables
variables = [
    "config/settings.json",
    "myapp.config:constants"
]
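For the dotted-path import above, config/prompts.py only needs to expose a module-level string. A hypothetical example (the variable's content is illustrative):

```python
# config/prompts.py -- hypothetical module backing the
# "config.prompts.greeting_prompt" reference above.
greeting_prompt = (
    "Hello {{ customer_name }}, thanks for reaching out. "
    "How can I help you today?"
)
```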

4. Auto-Exporting Prompts to TOML

You can automatically export your entire loaded prompt structure to a central pyprompts.toml file. To keep the export clean and readable:

  • Multiline templates are saved as separate .md files in a prompts/ directory.
  • The TOML file references them by relative path.

prompts = DynaPrompt(settings_files=["examples/"], auto_export=True)
_ = prompts.google.gemini # Triggers lazy-load and export
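The exported pyprompts.toml then points at the extracted files by relative path. An illustrative sketch (the exact keys, paths, and values may differ):

```toml
# pyprompts.toml (illustrative)
[default.google.gemini]
template = "prompts/google_gemini.md"  # multiline body extracted to its own file
model = "gemini-1.5-pro"               # hypothetical config value
```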

🛡️ Validation & Hooks

DynaPrompt allows you to enforce constraints on your rendered prompts (Validation) and intercept the rendering process to inject context automatically (Hooks).

Example: Enforcing Token Limits and Injecting Context

from dynaprompt import DynaPrompt
from dynaprompt.validator import PromptValidator

# 1. Create a validator to prevent overly long prompts
# (whitespace word count is used as a rough proxy for token count)
class TokenLimitValidator(PromptValidator):
    def validate(self, node, rendered) -> None:
        if len(rendered.text.split()) > 2000:
            raise ValueError(f"Prompt '{node.name}' exceeds the 2000-word limit!")

prompts = DynaPrompt(
    settings_files=["prompts/"],
    validators=[TokenLimitValidator()]
)

# 2. Add a Pre-Render Hook to automatically inject the current date into EVERY prompt
def inject_date(node, kwargs):
    from datetime import datetime
    kwargs["current_date"] = datetime.now().strftime("%Y-%m-%d")
    return kwargs

prompts.add_hook("before_render", "inject_date", inject_date)

# 3. Render
# The hook automatically injects `current_date`, and the validator ensures it's safe!
rendered = prompts.system.render(user_name="Emam")
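Because before_render hooks are plain functions of (node, kwargs), they can be unit-tested without loading any prompt files. The inject_date hook is repeated here so the snippet is self-contained:

```python
from datetime import datetime

def inject_date(node, kwargs):
    kwargs["current_date"] = datetime.now().strftime("%Y-%m-%d")
    return kwargs

# `node` is unused by this hook, so None is enough for a unit test.
out = inject_date(None, {"user_name": "Emam"})
assert out["current_date"] == datetime.now().strftime("%Y-%m-%d")
assert out["user_name"] == "Emam"
```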

🔍 Inspection & Tab-Completion

DynaPrompt is designed for developer productivity.

  • Tab-Completion: Use dir(prompts) or hit Tab in your IDE to see all available prompts and schemas.
  • History Tracking: Inspect exactly where a prompt was loaded from and how it was merged across layers.
print(prompts.inspect("customer_support"))

Project details


Download files

Download the file for your platform.

Source Distribution

dynaprompt-0.3.2.tar.gz (1.0 MB)

Uploaded Source

Built Distribution


dynaprompt-0.3.2-py3-none-any.whl (31.3 kB)

Uploaded Python 3

File details

Details for the file dynaprompt-0.3.2.tar.gz.

File metadata

  • Download URL: dynaprompt-0.3.2.tar.gz
  • Size: 1.0 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: uv/0.11.12

File hashes

Hashes for dynaprompt-0.3.2.tar.gz

  • SHA256: 3a3ae8e13e49242109651bb4998f42219196188bc7f79500f28c1ce3c3c47d71
  • MD5: 918d4b67f8d7afe1a88a2c47802224a2
  • BLAKE2b-256: 85b66f0416ceae4f66f9200aaa4d96b769dc027a46384a5f26d508546f38b6d9


File details

Details for the file dynaprompt-0.3.2-py3-none-any.whl.

File metadata

  • Download URL: dynaprompt-0.3.2-py3-none-any.whl
  • Size: 31.3 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: uv/0.11.12

File hashes

Hashes for dynaprompt-0.3.2-py3-none-any.whl

  • SHA256: 2735da16d4a853c57d3ad591119b3b8d801cf4eb6a328c843860c9dfc1099277
  • MD5: 9b3a191cbd75a2c016ea52ca1f3909b5
  • BLAKE2b-256: fb6ff4e4168d7ad836df56496b845d993ebc8894198e50e29680094a7abc28c8

