
Dotpromptz is a language-neutral executable prompt template file format for Generative AI.

Project description

LLM Adapters & CLI (Fork Extension)

This fork adds LLM API adapters and a command-line tool so you can run .prompt files directly against OpenAI, Anthropic, Google Gemini, or any third-party compatible service (all adapters support custom base_url).

Quick Start (CLI)

# Install (all adapters included by default)
uv add "dotpromptz-py"

# Set API key
export OPENAI_API_KEY="sk-..."

# Run a prompt (single or batch; the mode is auto-detected
# from the frontmatter input)
runprompt my_prompt.prompt

Adapter and model are configured in the .prompt file frontmatter:

---
adapter: openai          # optional — auto-inferred from config.model name
config:
  model: gpt-4o
runtime:
  max_workers: 10       # Concurrent workers for batch processing
  output_dir: ./results # Optional directory for output files
  jsonl: true           # Output in JSONL format
input:
  topic: "AI"
---
Tell me about {{topic}}.
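The `runtime.max_workers` setting above caps how many records are processed concurrently in batch mode. A minimal sketch of how such a cap could be enforced with an `asyncio.Semaphore` (names and structure are illustrative assumptions, not the library's actual implementation):

```python
import asyncio

async def run_batch(records, generate, max_workers=10):
    """Run generate() over all records, at most max_workers at a time."""
    sem = asyncio.Semaphore(max_workers)

    async def run_one(record):
        async with sem:
            return await generate(record)

    # gather() preserves input order even though calls overlap.
    return await asyncio.gather(*(run_one(r) for r in records))

async def demo():
    async def generate(record):
        await asyncio.sleep(0)  # stand-in for a real API call
        return f"Hello {record['name']}!"

    return await run_batch(
        [{"name": "Alice"}, {"name": "Bob"}], generate, max_workers=2
    )

results = asyncio.run(demo())
print(results)
# → ['Hello Alice!', 'Hello Bob!']
```

The semaphore bounds in-flight requests without chunking the batch, so a slow record does not stall an entire chunk.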

For third-party compatible services (e.g. DeepSeek, Ollama):

---
adapter:
  name: openai
  base_url: https://api.deepseek.com
config:
  model: deepseek-chat
---
Tell me about {{topic}}.

Input Formats

Input data is configured in the .prompt file frontmatter using the input field. The CLI no longer takes a separate input file argument.

1. Inline data (single record):

---
input:
  name: "Alice"
  age: 30
---
Hello {{name}}!

2. File reference (auto-detects single vs batch):

---
input: "data.json"  # Relative to .prompt file
---
Process {{field1}} and {{field2}}.

3. Batch mode (list of records):

---
input:
  - {name: "Alice", age: 30}
  - {name: "Bob", age: 25}
---
Hello {{name}}!

4. JSONL file (always batch):

---
input: "batch.jsonl"
---
Process {{item}}.
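The four formats above can be distinguished mechanically from the type and shape of the `input` value. A hypothetical sketch of that detection logic (function name and return values are illustrative, not the library's API):

```python
def detect_input_mode(input_value):
    """Classify the frontmatter `input` field (illustrative sketch).

    - dict            -> single record, inline
    - list            -> batch, inline
    - str "*.jsonl"   -> batch, loaded from file
    - other str       -> file reference; single vs batch decided after loading
    """
    if isinstance(input_value, dict):
        return ("single", "inline")
    if isinstance(input_value, list):
        return ("batch", "inline")
    if isinstance(input_value, str):
        if input_value.endswith(".jsonl"):
            return ("batch", "file")
        return ("auto", "file")
    raise ValueError(f"Unsupported input type: {type(input_value).__name__}")

print(detect_input_mode({"name": "Alice"}))  # → ('single', 'inline')
print(detect_input_mode("batch.jsonl"))      # → ('batch', 'file')
```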

File path security: All file paths are resolved relative to the .prompt file directory. Path traversal attempts (e.g. ../../etc/passwd) are rejected.
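A minimal sketch of this kind of containment check, assuming resolution against the .prompt file's directory and Python 3.9+'s `Path.is_relative_to` (illustrative, not the library's actual code):

```python
from pathlib import Path

def resolve_input_path(prompt_dir, input_path):
    """Resolve input_path relative to the .prompt file's directory,
    rejecting any path that escapes it (illustrative sketch)."""
    base = Path(prompt_dir).resolve()
    candidate = (base / input_path).resolve()
    if not candidate.is_relative_to(base):
        raise ValueError(f"Path escapes prompt directory: {input_path}")
    return candidate

# Usage: a plain relative file is accepted, a traversal attempt is not.
resolve_input_path(".", "data.json")
try:
    resolve_input_path(".", "../../../etc/passwd")
except ValueError as e:
    print(e)
```

Resolving both paths before comparing defeats `..` segments and symlink tricks that a naive string-prefix check would miss.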

Quick Start (Python)

import asyncio
from pathlib import Path

from dotpromptz import Dotprompt
from dotpromptz.typing import DataArgument
from dotpromptz.adapters import get_adapter

async def main():
    dp = Dotprompt()
    source = Path("my_prompt.prompt").read_text()

    # Option 1: Use frontmatter input (auto-loaded from the .prompt file)
    rendered = dp.render(source)

    # Option 2: Override with caller data
    # rendered = dp.render(source, data=DataArgument(input={"topic": "AI"}))

    adapter = get_adapter("openai")
    response = await adapter.generate(rendered)
    print(response.text)

asyncio.run(main())

Supported Adapters

Adapter         Env Var
OpenAI          OPENAI_API_KEY
Anthropic       ANTHROPIC_API_KEY
Google Gemini   GOOGLE_API_KEY

All adapters and their SDK dependencies (openai, anthropic, google-genai) are included as core dependencies — no extras needed. All adapters support base_url for third-party compatible endpoints (e.g. DeepSeek, vLLM, Ollama). Configure via frontmatter adapter.base_url or env vars (OPENAI_BASE_URL / ANTHROPIC_BASE_URL / GOOGLE_BASE_URL).
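When the frontmatter omits `adapter`, it is inferred from the model name. A hedged sketch of how such inference might work; the prefix table below is an assumption for illustration, not the library's actual mapping:

```python
# Hypothetical prefix table; the real inference logic may differ.
_MODEL_PREFIXES = {
    "gpt-": "openai",
    "claude-": "anthropic",
    "gemini-": "google",
}

def infer_adapter(model):
    """Guess the adapter from the model name when `adapter` is omitted."""
    for prefix, adapter in _MODEL_PREFIXES.items():
        if model.startswith(prefix):
            return adapter
    raise ValueError(
        f"Cannot infer adapter for model {model!r}; set `adapter` explicitly."
    )

print(infer_adapter("gpt-4o"))                 # → openai
print(infer_adapter("gemini-2.0-flash-exp"))   # → google
```

Raising on unknown prefixes (rather than silently defaulting) matters for third-party model names like `deepseek-chat`, which must declare their adapter explicitly as shown earlier.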

Image Generation (Google Gemini)

Dotprompt supports native image generation via Gemini's generateContent API with response_modalities=["IMAGE"]. To use it, set output.format: image and output.save_path in the frontmatter.

Text-to-image example (draw_cat.prompt):

---
adapter: google
config:
  model: gemini-2.0-flash-exp
output:
  format: image
  save_path: output/cat.png
input:
  style: "watercolor"
---
Draw a {{style}} cat sitting on a windowsill.

runprompt draw_cat.prompt
# → Image saved to: output/cat.png

Image-to-image example (using {{media}} helper for input):

---
adapter: google
config:
  model: gemini-2.0-flash-exp
output:
  format: image
  save_path: output/edited.png
input:
  image_url: "https://example.com/photo.jpg"
  instruction: "Make it black and white"
---
{{media url=image_url}}
{{instruction}}

runprompt edit_image.prompt

Notes:

  • output.save_path is required when format is image. Omitting it raises a validation error at parse time.
  • save_path is validated against path traversal attacks (e.g. ../../etc/passwd) — only paths within the current working directory are allowed.
  • Currently only the Google adapter supports image generation. Only the first generated image is saved.
  • The parent directory of save_path is created automatically if it does not exist.
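Putting the last three notes together, a hedged sketch of `save_path` handling (containment check plus parent-directory creation; names are illustrative, not the library's code):

```python
from pathlib import Path

def prepare_save_path(save_path):
    """Validate that save_path stays inside the current working directory,
    then create its parent directory (illustrative sketch)."""
    cwd = Path.cwd().resolve()
    target = (cwd / save_path).resolve()
    if not target.is_relative_to(cwd):
        raise ValueError(f"save_path escapes working directory: {save_path}")
    # Mirror the documented behavior: create missing parent directories.
    target.parent.mkdir(parents=True, exist_ok=True)
    return target

# Usage: "output/cat.png" is accepted and output/ is created;
# a traversal like "../outside.png" raises ValueError.
prepare_save_path("output/cat.png")
```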

Download files

Download the file for your platform.

Source Distribution

dotpromptz_py-1.3.0.tar.gz (168.3 kB)

Uploaded Source

Built Distribution


dotpromptz_py-1.3.0-py3-none-any.whl (59.6 kB)

Uploaded Python 3

File details

Details for the file dotpromptz_py-1.3.0.tar.gz.

File metadata

  • Download URL: dotpromptz_py-1.3.0.tar.gz
  • Upload date:
  • Size: 168.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for dotpromptz_py-1.3.0.tar.gz
Algorithm     Hash digest
SHA256        e4a9e82144e287fd31745f9996cc2bd5ae839ef9198f1f3f4a08dda390ba66a0
MD5           3e069440d94ec1c0e0fe6842cf9ec20e
BLAKE2b-256   5c5a2279c5d57bb82aebe71a7cb536912f9437057674eec09659e46370dfa8a6


Provenance

The following attestation bundles were made for dotpromptz_py-1.3.0.tar.gz:

Publisher: publish-pypi.yml on my-three-kingdoms/dotpromptz

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file dotpromptz_py-1.3.0-py3-none-any.whl.

File metadata

  • Download URL: dotpromptz_py-1.3.0-py3-none-any.whl
  • Upload date:
  • Size: 59.6 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for dotpromptz_py-1.3.0-py3-none-any.whl
Algorithm     Hash digest
SHA256        7a9058758b0aa1f3fdca904ff6a170c2bdadb1e833adb0ba4059644f3244c899
MD5           ba56cddfc4c191f37e85efbadb7dc1b6
BLAKE2b-256   a5232fa19c7ea48354db25cecaab0f89264f0c3ad122441407d8e51cd58d5383


Provenance

The following attestation bundles were made for dotpromptz_py-1.3.0-py3-none-any.whl:

Publisher: publish-pypi.yml on my-three-kingdoms/dotpromptz

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
