
Structured prompts using template strings


t-prompts


Provenance-preserving prompts for LLMs using Python 3.14's template strings

What is t-prompts?

t-prompts turns Python 3.14+ t-strings into navigable trees that preserve full provenance information (expression text, conversions, format specs). Perfect for building, composing, and auditing LLM prompts.

Unlike f-strings, which immediately evaluate to strings, t-prompts keeps the structure intact so you can:

  • Trace exactly which variable produced which part of your prompt
  • Navigate nested prompt components programmatically
  • Compose complex prompts from smaller, reusable pieces
  • Audit with complete provenance for compliance and debugging
  • Validate types at prompt creation (no accidental str(obj) surprises)

Requirements: Python 3.14+

Quick Example

from t_prompts import prompt

# Create a structured prompt
instructions = "Always answer politely."
p = prompt(t"Obey {instructions:inst}")

# Renders like an f-string
print(str(p))  # "Obey Always answer politely."

# But preserves full provenance
node = p['inst']
print(node.expression)  # "instructions" (original variable name)
print(node.value)       # "Always answer politely."

This enables rich tooling.

Targeted Use Cases

  • Prompt Debugging: "What exactly did this tangle of code render to?"
  • Prompt Optimization (Performance): "What wording / content best achieves my goal?"
  • Prompt Optimization (Size): "How do I get the same result with fewer words?"
  • Prompt Compacting: "The LLM tells me to keep it short; now what do I do?"

Caveats

Despite the name, nothing ties this library specifically to LLMs or generative models, even though it targets the creation of structured multi-modal prompts. (It is more "t" than "prompts".) To use it for an actual LLM call, you would need to convert the intermediate representation into a model-specific form, though for text this can be as simple as str(prompt).

Documentation

📚 Full documentation: https://habemus-papadum.github.io/t-prompts/

Installation

pip install t-prompts

Or with uv:

uv add t-prompts

Development

This project uses uv and pnpm for dependency management.

# One-time setup (per dev machine)
curl -LsSf https://astral.sh/uv/install.sh | sh # Other options exist!
curl -fsSL https://get.pnpm.io/install.sh | sh - # Other options exist!
./scripts/setup-visual-tests.sh

# Repo setup (per clone)
./scripts/setup.sh

# Lint and format
uv run ruff check .
uv run ruff format .

# Build documentation
uv run mkdocs serve

See Developer Setup for detailed instructions.

License

MIT License - see LICENSE file for details.
