Dotpromptz is a language-neutral executable prompt template file format for Generative AI.
## LLM Adapters & CLI (Fork Extension)

This fork adds LLM API adapters and a command-line tool so you can run `.prompt` files directly against OpenAI, Anthropic, Google Gemini, or any compatible third-party service (all adapters support a custom `base_url`).
## Quick Start (CLI)

```shell
# Install (all adapters included by default)
uv add "dotpromptz-py"

# Set API key
export OPENAI_API_KEY="sk-..."

# Run a prompt (input configured in frontmatter)
runprompt my_prompt.prompt

# Batch mode: the same command; batch is auto-detected from the frontmatter input
runprompt my_prompt.prompt
```
Adapter and model are configured in the .prompt file frontmatter:
```yaml
---
adapter: openai          # optional; auto-inferred from config.model name
config:
  model: gpt-4o
runtime:
  max_workers: 10        # concurrent workers for batch processing
output:
  output_dir: ./results  # optional directory for output files
  jsonl: true            # output in JSONL format
input:
  topic: "AI"
---
Tell me about {{topic}}.
```
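The `runtime.max_workers` setting caps how many batch records are processed concurrently. A minimal sketch of this pattern using an `asyncio.Semaphore` (a hypothetical illustration, not the library's actual implementation):

```python
import asyncio

async def run_batch(records, worker, max_workers=10):
    # Cap concurrency with a semaphore, mirroring runtime.max_workers.
    sem = asyncio.Semaphore(max_workers)

    async def run_one(record):
        async with sem:
            return await worker(record)

    # gather preserves input order regardless of completion order
    return await asyncio.gather(*(run_one(r) for r in records))

async def demo():
    async def worker(record):
        await asyncio.sleep(0)  # stand-in for an LLM API call
        return record["name"].upper()
    return await run_batch([{"name": "alice"}, {"name": "bob"}], worker, max_workers=2)

print(asyncio.run(demo()))  # → ['ALICE', 'BOB']
```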
For third-party compatible services (e.g. DeepSeek, Ollama):
```yaml
---
adapter:
  name: openai
  base_url: https://api.deepseek.com
config:
  model: deepseek-chat
---
Tell me about {{topic}}.
```
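When `adapter` is omitted, it is inferred from the model name. A hypothetical sketch of how such prefix-based inference could work (the real rules live in the library and may differ):

```python
# Map of model-name prefixes to adapter names (illustrative, not exhaustive).
PREFIXES = {
    "gpt-": "openai",
    "claude-": "anthropic",
    "gemini-": "google",
}

def infer_adapter(model: str) -> str:
    for prefix, adapter in PREFIXES.items():
        if model.startswith(prefix):
            return adapter
    raise ValueError(f"cannot infer adapter for {model!r}; set 'adapter' explicitly")

print(infer_adapter("gpt-4o"))                # → openai
print(infer_adapter("gemini-2.0-flash-exp"))  # → google
```

Models without a recognizable prefix (e.g. `deepseek-chat`) would need an explicit `adapter` field, as in the example above.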
## Input Formats

Input data is configured in the `.prompt` file frontmatter via the `input` field; the CLI no longer takes a separate input file argument.
1. Inline data (single record):
```yaml
---
input:
  name: "Alice"
  age: 30
---
Hello {{name}}!
```
2. File reference (auto-detects single vs batch):
```yaml
---
input: "data.json"  # relative to the .prompt file
---
Process {{field1}} and {{field2}}.
```
3. Batch mode (list of records):
```yaml
---
input:
  - {name: "Alice", age: 30}
  - {name: "Bob", age: 25}
---
Hello {{name}}!
```
4. JSONL file (always batch):
```yaml
---
input: "batch.jsonl"
---
Process {{item}}.
```
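JSONL treats each non-empty line as one JSON record, which is why a `.jsonl` input always implies batch mode. A minimal sketch of how such a file could be parsed into records:

```python
import json
from io import StringIO

def load_jsonl(stream):
    # Each non-empty line is one JSON record; the result is a batch of inputs.
    return [json.loads(line) for line in stream if line.strip()]

sample = StringIO('{"item": "apples"}\n{"item": "pears"}\n')
print(load_jsonl(sample))  # → [{'item': 'apples'}, {'item': 'pears'}]
```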
**File paths:** all relative paths are resolved against the `.prompt` file's directory; absolute paths are also supported.
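The resolution rule above can be sketched in a few lines (an illustration of the behavior, not the library's code):

```python
from pathlib import Path

def resolve_input_path(prompt_file: str, input_ref: str) -> Path:
    # Relative refs resolve against the .prompt file's directory;
    # absolute refs are used as-is.
    ref = Path(input_ref)
    if ref.is_absolute():
        return ref
    return Path(prompt_file).parent / ref

print(resolve_input_path("/prompts/my_prompt.prompt", "data.json"))  # → /prompts/data.json
```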
## Quick Start (Python)

```python
import asyncio
from pathlib import Path

from dotpromptz import Dotprompt
from dotpromptz.adapters import get_adapter
from dotpromptz.typing import DataArgument

async def main():
    dp = Dotprompt()
    source = Path("my_prompt.prompt").read_text()

    # Option 1: use frontmatter input (auto-loaded from the .prompt source)
    rendered = dp.render(source)

    # Option 2: override with caller data
    # rendered = dp.render(source, data=DataArgument(input={"topic": "AI"}))

    adapter = get_adapter("openai")
    response = await adapter.generate(rendered)
    print(response.text)

asyncio.run(main())
```
## Supported Adapters

| Adapter | Env Var |
|---|---|
| OpenAI | `OPENAI_API_KEY` |
| Anthropic | `ANTHROPIC_API_KEY` |
| Google Gemini | `GOOGLE_API_KEY` |
All adapters and their SDK dependencies (`openai`, `anthropic`, `google-genai`) are included as core dependencies; no extras are needed.

All adapters support `base_url` for third-party compatible endpoints (e.g. DeepSeek, vLLM, Ollama). Configure it via the frontmatter `adapter.base_url` or via env vars (`OPENAI_BASE_URL` / `ANTHROPIC_BASE_URL` / `GOOGLE_BASE_URL`).
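For instance, a local Ollama server exposes an OpenAI-compatible API; a hypothetical `.prompt` frontmatter pointing at its default endpoint (the URL and model name here are assumptions; adjust for your setup):

```yaml
---
adapter:
  name: openai
  base_url: http://localhost:11434/v1
config:
  model: llama3
---
Tell me about {{topic}}.
```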
## Image Generation (Google Gemini)

Dotprompt supports native image generation via Gemini's `generateContent` API with `response_modalities=["IMAGE"]`. To use it, set `output.format: image` and `output.save_path` in the frontmatter.
Text-to-image example (`draw_cat.prompt`):

```yaml
---
adapter: google
config:
  model: gemini-2.0-flash-exp
output:
  format: image
  save_path: output/cat.png
input:
  style: "watercolor"
---
Draw a {{style}} cat sitting on a windowsill.
```

```shell
runprompt draw_cat.prompt
# → Image saved to: output/cat.png
```
Image-to-image example (`edit_image.prompt`, using the `{{media}}` helper for image input):

```yaml
---
adapter: google
config:
  model: gemini-2.0-flash-exp
output:
  format: image
  save_path: output/edited.png
input:
  image_url: "https://example.com/photo.jpg"
  instruction: "Make it black and white"
---
{{media url=image_url}}
{{instruction}}
```

```shell
runprompt edit_image.prompt
```
Notes:

- `output.save_path` is required when `format` is `image`; omitting it raises a validation error at parse time.
- `save_path` is validated against path traversal attacks (e.g. `../../etc/passwd`): only paths within the current working directory are allowed.
- Currently only the Google adapter supports image generation, and only the first generated image is saved.
- The parent directory of `save_path` is created automatically if it does not exist.
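The path traversal check described above can be sketched with `pathlib` (an illustration of the validation, not the library's code):

```python
from pathlib import Path

def check_save_path(save_path: str, base: str = ".") -> Path:
    # Reject any save_path that resolves outside the working directory,
    # e.g. ../../etc/passwd.
    base_dir = Path(base).resolve()
    target = (base_dir / save_path).resolve()
    try:
        target.relative_to(base_dir)
    except ValueError:
        raise ValueError(f"save_path escapes working directory: {save_path}")
    return target

print(check_save_path("output/cat.png").name)  # → cat.png
try:
    check_save_path("../../etc/passwd")
except ValueError as e:
    print("rejected:", e)
```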
## File details: dotpromptz_py-1.4.0.tar.gz

- Download URL: dotpromptz_py-1.4.0.tar.gz
- Upload date:
- Size: 162.0 kB
- Tags: Source
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.7
File hashes:

| Algorithm | Hash digest |
|---|---|
| SHA256 | `62dd5113b8396677e834ba5439dae5425e17684fc0403dd38ec28c0e54a976d5` |
| MD5 | `ffa46530b17e0d0f6a9fd29aaebc8538` |
| BLAKE2b-256 | `fe1ffc03a9bb9e66caefb9050f087d78e314d79a4d78a82d2e226617a52c88d4` |
### Provenance

The following attestation bundles were made for dotpromptz_py-1.4.0.tar.gz:

Publisher: publish-pypi.yml on my-three-kingdoms/dotpromptz

- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: dotpromptz_py-1.4.0.tar.gz
- Subject digest: 62dd5113b8396677e834ba5439dae5425e17684fc0403dd38ec28c0e54a976d5
- Sigstore transparency entry: 1004809297
- Permalink: my-three-kingdoms/dotpromptz@af1130f1e1bf67f9c445bcd34fd8cf8db699f2fe
- Branch / Tag: refs/heads/main
- Owner: https://github.com/my-three-kingdoms
- Access: private
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: publish-pypi.yml@af1130f1e1bf67f9c445bcd34fd8cf8db699f2fe
- Trigger Event: workflow_dispatch
## File details: dotpromptz_py-1.4.0-py3-none-any.whl

- Download URL: dotpromptz_py-1.4.0-py3-none-any.whl
- Upload date:
- Size: 52.9 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.7
File hashes:

| Algorithm | Hash digest |
|---|---|
| SHA256 | `8adcd9c445836bc363e5bcf3e389e50ba32861d5257ddc6575fe819c6b9d040f` |
| MD5 | `3c89e791bd09d9bc0e8bdf257ba34e19` |
| BLAKE2b-256 | `cbaa41711aeda158352329b6312da18d81f3c6357c5bad01989aaba43c21ea8c` |
### Provenance

The following attestation bundles were made for dotpromptz_py-1.4.0-py3-none-any.whl:

Publisher: publish-pypi.yml on my-three-kingdoms/dotpromptz

- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: dotpromptz_py-1.4.0-py3-none-any.whl
- Subject digest: 8adcd9c445836bc363e5bcf3e389e50ba32861d5257ddc6575fe819c6b9d040f
- Sigstore transparency entry: 1004809299
- Permalink: my-three-kingdoms/dotpromptz@af1130f1e1bf67f9c445bcd34fd8cf8db699f2fe
- Branch / Tag: refs/heads/main
- Owner: https://github.com/my-three-kingdoms
- Access: private
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: publish-pypi.yml@af1130f1e1bf67f9c445bcd34fd8cf8db699f2fe
- Trigger Event: workflow_dispatch