
PromptML (Prompt Markup Language)


A simple, yet elegant markup language for defining AI Prompts as Code (APaC). Built to be used by AI agents to automatically prompt for other AI systems.

Why PromptML?

PromptML gives prompt engineers a way to define AI prompts deterministically. It is a Domain Specific Language (DSL) that describes the characteristics of a prompt, including its context, objective, instructions, and metadata. A regular prompt mixes all of these aspects into a single block of text; PromptML splits them into separate sections and makes the information explicit.

The language grammar can be found here: grammar.lark

For the impatient:

Install promptml-cli from https://github.com/narenaryan/promptml-cli to run PromptML programs with OpenAI and Google models.

What does PromptML look like?

The language is simple. You start a block with an @section annotation and close it with the @end marker. Comments start with the # character. Prompt files use the .pml extension.

@prompt
    # Add task context
    @context
    @end

    # Add task objective
    @objective
    # This is the final question or ask
    @end

    # Add one or more instructions to execute the prompt
    @instructions
        @step
        @end
    @end

    # Add one or more examples
    @examples
        @example
            @input
            # Add your example input
            @end
            @output
            # Add your example output
            @end
        @end
    @end

    # Add task constraints
    @constraints
        @length min: 1 max: 10  @end
    @end

    # Add prompt category
    @category
    @end

    # Add custom metadata
    @metadata
    @end
@end
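
The skeleton above can be illustrated with a tiny, self-contained sketch. Note that this is not the real parser (the library is grammar-based, via grammar.lark); a regex like this only handles flat, non-nested sections, but it shows the @section ... @end structure:

```python
import re

# Illustrative sketch only: the actual PromptML parser uses a Lark
# grammar. This regex handles flat, non-nested sections and ignores
# comments; it exists purely to demonstrate the block structure.
def extract_sections(pml: str) -> dict:
    """Map each @section name to its stripped body text."""
    return {
        m.group(1): m.group(2).strip()
        for m in re.finditer(r"@(\w+)\n(.*?)@end", pml, re.DOTALL)
    }

snippet = """@context
You are a helpful assistant.
@end

@objective
Summarize the given text.
@end"""

print(extract_sections(snippet))
# {'context': 'You are a helpful assistant.', 'objective': 'Summarize the given text.'}
```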

See prompt.pml for the complete syntax.

Design

Regular text prompts are abstract by nature. Natural languages are very flexible, but that flexibility comes at the cost of reliability. How do you provide context to an AI system and then ask it something? Shouldn't that be specified explicitly? PromptML is an attempt to make the contents of a prompt explicit with a simple language.

Core tenets of PromptML

Below are the qualities PromptML brings to the prompt-engineering domain:

  1. Standardization instead of fragmentation
  2. Collaboration instead of confusion
  3. Enabling version control-ability
  4. Promoting verbosity for better results

Why not use XML, YAML, or JSON for PromptML?

First, XML, JSON, and YAML are not DSLs; they are generic data formats that can represent any kind of data. Second, generative AI needs a strict yet flexible language with fixed constraints that can evolve along with the domain.

PromptML is built exactly to solve those two issues.

The language grammar is influenced by XML and Ruby, so if you know either of them, you will feel comfortable writing prompts in PromptML.

Usage

  1. Install the Python requirements
pip install -r requirements.txt
  2. Import the parser and parse a PromptML file
from promptml.parser import PromptParser

promptml_code = '''
    @prompt
        @context
            This is the context section.
        @end

        @objective
            This is the objective section.
        @end

        @instructions
            @step
                Step 1
            @end
        @end

        @examples
            @example
                @input
                    Input example 1
                @end
                @output
                    Output example 1
                @end
            @end
        @end

        @category
            Prompt Management
        @end

        @constraints
            @length min: 1 max: 10 @end
        @end

        @metadata
            top_p: 0.9
            n: 1
            team: promptml
        @end
    @end
'''

parser = PromptParser(promptml_code)
prompt = parser.parse()

print(prompt)
# Output: {
#     'context': 'This is the context section.',
#     'objective': 'This is the objective section.',
#     'category': 'Prompt Management',
#     'instructions': ['Step 1'],
#     'examples': [
#         {'input': 'Input example 1', 'output': 'Output example 1'}
#     ],
#     'constraints': {'length': {'min': 1, 'max': 10}},
#     'metadata': {'top_p': 0.9, 'n': 1, 'team': 'promptml'}
# }

Defining variables

You can define variables in a PromptML file and use them in the prompt's context and objective. Variables are defined in the @vars section and referenced with the $var syntax in either the context or objective section.

@vars
    name = "John Doe"
@end

@prompt
    @context
        You are a name changing expert.
    @end

    @objective
        You have to change the name: $name to an ancient name.
    @end
@end

Serialization

A PromptML document can be serialized into multiple formats:

  1. XML
  2. YAML
  3. JSON

XML prompts are well understood by LLMs, and PromptML code can be used to generate one.

Continuing from the earlier example in this README, call the to_xml() method on the prompt object to generate an XML prompt.

# XML
serialized = prompt.to_xml()

print(serialized)

Similarly, you can generate a JSON or YAML prompt from the same object:

# JSON
prompt.to_json()

# YAML
prompt.to_yaml()
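
Since the parsed prompt shown earlier is a plain dictionary, the JSON form can be approximated with the standard library. This is a sketch of the idea; the exact output of to_json() may differ in layout:

```python
import json

# The parsed prompt from the earlier example is a plain dict, so its
# JSON serialization is straightforward; prompt.to_json() may format
# the result differently.
prompt_dict = {
    "context": "This is the context section.",
    "objective": "This is the objective section.",
    "category": "Prompt Management",
}
print(json.dumps(prompt_dict, indent=2))
```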

TODO

We are currently working on:

  1. VSCode syntax-highlighting support
  2. More unit tests
  3. A RAG example

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

promptml-0.7.1.tar.gz (206.9 kB)


Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

promptml-0.7.1-py3-none-any.whl (8.3 kB)


File details

Details for the file promptml-0.7.1.tar.gz.

File metadata

  • Download URL: promptml-0.7.1.tar.gz
  • Upload date:
  • Size: 206.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.11.11

File hashes

Hashes for promptml-0.7.1.tar.gz:

  • SHA256: 3dfabe9a8d0155f142467be6f549b972b34c19e9b100d5984883d4704d5ec871
  • MD5: 48409c5b8016ab7e97b60a56473b5fed
  • BLAKE2b-256: f38e9f24426367e88d4cfbacc486b0454788896549dd45e2ef0e90a77f1c4182

See more details on using hashes here.

File details

Details for the file promptml-0.7.1-py3-none-any.whl.

File metadata

  • Download URL: promptml-0.7.1-py3-none-any.whl
  • Upload date:
  • Size: 8.3 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.11.11

File hashes

Hashes for promptml-0.7.1-py3-none-any.whl:

  • SHA256: 8863ad9646a44e021a3b40ea792851fc33be595990f354d55300cf92c53c0955
  • MD5: 87f4aec9aca3863df2eb1a507252de91
  • BLAKE2b-256: ce6183492159b7c1474e8197179b7456bee5614aa668e2271334b6c6bcf2fb5a

