PDL (Prompt Declaration Language)

PDL is a declarative language designed for developers to create reliable, composable LLM prompts and integrate them into software systems. It provides a structured way to specify prompt templates, enforce validation, and compose LLM calls with traditional rule-based systems.

Quick Start | Example | GUI | Key Features | Documentation | API Cheat Sheet

Quick Start

A PDL program is written declaratively, in YAML. The pdl command-line tool interprets the program, accumulating messages and sending them to the models it specifies. PDL supports both hosted and local models. See here for instructions on how to install an Ollama model locally.

To install the pdl command line tool:

pip install prompt-declaration-language
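
To verify the installation, print the CLI's usage message (assuming the conventional --help flag):

pdl --help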

What's New

Check out AutoPDL, PDL's prompt optimizer (Spiess et al., 2025)! AutoPDL can optimize any part of a PDL program, including few-shot examples, textual prompts, and even prompting patterns. It outputs an optimized PDL program with the best values it found.

For a tutorial on how to use AutoPDL, see the AutoPDL documentation.

Example Program: A Basic LLM Call

[Screenshot: PDL GUI]

The following program accumulates a single user message, "write a hello world example, and explain how to run it", and sends it to the ollama/granite-3.2:8b model:

text:
- "write a hello world example, and explain how to run it"
- model: ollama/granite-3.2:8b

To run this program:

pdl <path/to/example.pdl>

For more information on the pdl CLI, see here. To try this example live in the GUI, click here.

Graphical Experience

The screenshot above shows PDL's graphical user interface, which supports interactive debugging and live programming. On macOS, you can install it via brew install pdl; downloads for other platforms are available here. You can also kick the tires with a web version of the GUI here.

To generate a trace for use in the GUI:

pdl --trace <file.json> <my-example.pdl> 
[Screenshot: PDL GUI trace view]

Key Features

  • LLM Integration: Compatible with any LLM, including IBM watsonx
  • Prompt Engineering:
    • Template system for single/multi-shot prompting
    • Composition of multiple LLM calls
    • Integration with tools (code execution & APIs)
  • Development Tools:
    • Type checking for model I/O
    • Python SDK (see the sketch after this list)
    • Chat API support
    • Live document visualization for debugging
  • Control Flow: Variables, conditionals, loops, and functions
  • I/O Operations: File/stdin reading, JSON parsing
  • API Integration: Native REST API support (Python)
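
As a minimal sketch of the Python SDK mentioned above: the pdl package exposes functions for executing PDL programs from Python. The exec_str entry point below reflects the SDK as documented, but treat the exact name and signature as an assumption and check the API docs:

# Sketch: run a PDL program from Python.
# exec_str is assumed from the documented SDK; verify against the API docs.
from pdl.pdl import exec_str

program = """
text:
- "write a hello world example, and explain how to run it"
- model: ollama/granite-3.2:8b
"""

# Interpret the YAML program and return the accumulated text result.
result = exec_str(program)
print(result)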

Documentation

API Cheat Sheet

PDL Quick Reference

Installation Details

PDL requires Python 3.11+ (Windows users should use WSL).

# Basic installation
pip install prompt-declaration-language

# Development installation with examples
pip install 'prompt-declaration-language[examples]'

Environment Setup

You can run PDL with local LLM models using Ollama, or with a cloud service. See here for instructions on how to install an Ollama model locally.

If you use watsonx:

export WX_URL="https://{region}.ml.cloud.ibm.com"
export WX_API_KEY="your-api-key"
export WATSONX_PROJECT_ID="your-project-id"

If you use Replicate:

export REPLICATE_API_TOKEN="your-token"
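
With the token set, a model block can reference a Replicate-hosted model by its LiteLLM-style identifier. A minimal sketch (the model name is illustrative; substitute one you have access to):

description: Replicate example
text:
- "write a hello world example, and explain how to run it"
- model: replicate/ibm-granite/granite-3.3-8b-instruct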

IDE Configuration

Install the YAML Language Support by Red Hat extension in VSCode. Then add the following settings for syntax highlighting and validation:

// .vscode/settings.json
{
    "yaml.schemas": {
        "https://ibm.github.io/prompt-declaration-language/dist/pdl-schema.json": "*.pdl"
    },
    "files.associations": {
        "*.pdl": "yaml",
    }
}

Code Examples

Variable Definition & Template Usage

In this example, we read external content from data.yaml and use watsonx as the LLM provider.

description: Template with variables
defs:
  user_input:
    read: ../code/data.yaml
    parser: yaml
text:
- model: watsonx/ibm/granite-34b-code-instruct
  input: |
    Process this input: ${user_input}
    Format the output as JSON.
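
For illustration, ../code/data.yaml can hold any YAML document; a hypothetical example that the parser: yaml clause would load into user_input:

# hypothetical contents of ../code/data.yaml
source: sensor-42
readings:
- 12.5
- 13.1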

Python Code Integration

description: Code execution example
text:
- "\nFind a random number between 1 and 20\n"
- def: N
  lang: python
  code: |
    import random
    # In PDL, a code block's output is whatever it assigns to `result`.
    result = random.randint(1, 20)
- "\nthe result is (${ N })\n"

Chat

The following program implements a simple multi-turn chatbot. Each read block stores the user's message in user_input and, via contribute: [context], adds it to the model's context without repeating it in the output; the loop runs until the user types /bye:

description: chatbot
text:
- read:
  def: user_input
  message: "hi? [/bye to exit]\n"
  contribute: [context]
- repeat:
    text:
    - model: ollama/granite-code:8b
    - read:
      def: user_input
      message: "> "
      contribute: [context]
  until: ${ user_input == '/bye'}

Trace Telemetry

PDL includes experimental support for gathering trace telemetry. This can be used for debugging or performance analysis, and to see the shape of prompts sent by LiteLLM to models.

For more information see here.


Contributing

See the contribution guidelines for details on:

  • Code style
  • Testing requirements
  • PR process
  • Issue reporting

References

Spiess et al. (2025). AutoPDL: Automatic Prompt Optimization for LLM Agents.
