
A library for working with prompt templates locally or on the Hugging Face Hub.


Prompt Templates

Prompt templates have become key artifacts for researchers and practitioners working with AI. There is, however, no standardized way of sharing them. Prompts and prompt templates are shared on the Hugging Face Hub in .txt files, in HF datasets, or as strings in model cards, and on GitHub as Python strings embedded in scripts, in JSON and YAML files, or in Jinja2 files.

Objectives and non-objectives of this library

Objectives

  • Provide functionality for working with prompt templates locally and sharing them on the Hugging Face Hub.
  • Propose a prompt template standard based on .yaml and .json files that enables modular development of complex LLM systems and is interoperable with other libraries.

Non-objectives

  • Compete with full-featured prompting libraries like LangChain, ell, etc. The objective is, instead, a simple solution for working with prompt templates locally or on the HF Hub, which is interoperable with other libraries and which the community can build upon.
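To make the template standard concrete: a chat prompt template in the proposed .yaml format might look like the following sketch. The field names here are assumptions inferred from the quick-start examples below, not a normative spec.

```yaml
# Hypothetical sketch of a chat prompt template file.
# Field names are assumed from the library's examples, not a normative spec.
prompt:
  template:
    - role: system
      content: "You are a coding assistant who explains concepts clearly."
    - role: user
      content: "Explain what {{concept}} is in {{programming_language}}."
  template_variables:
    - concept
    - programming_language
  metadata:
    version: "0.0.1"
```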

Documentation

A discussion of the standard prompt format, usage examples, and the API reference are available in the docs.

Quick start

Let's use this closed_system_prompts repo of official prompts from OpenAI and Anthropic. These prompt templates were either leaked or shared by the LLM providers themselves, but were originally in a non-machine-readable, non-standardized format.

1. Install the library:

pip install prompt-templates

2. List the available prompt templates in an HF Hub repository:

from prompt_templates import list_prompt_templates
files = list_prompt_templates("MoritzLaurer/closed_system_prompts")
print(files)
# ['claude-3-5-artifacts-leak-210624.yaml', 'claude-3-5-sonnet-text-090924.yaml', 'claude-3-5-sonnet-text-image-090924.yaml', 'openai-metaprompt-audio.yaml', 'openai-metaprompt-text.yaml']
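Conceptually, listing prompt templates plausibly amounts to filtering a repository's file listing down to the template file extensions. A minimal standalone sketch of that idea with a hard-coded listing (the real function queries the Hub; the file names and extensions here are illustrative assumptions):

```python
# Hypothetical sketch: the real list_prompt_templates queries the Hub;
# here we filter a hard-coded file listing down to template extensions.
repo_files = [
    "README.md",
    "claude-3-5-sonnet-text-090924.yaml",
    "openai-metaprompt-text.yaml",
    "assets/logo.png",
]
TEMPLATE_EXTENSIONS = (".yaml", ".yml", ".json")
templates = sorted(f for f in repo_files if f.endswith(TEMPLATE_EXTENSIONS))
print(templates)
# ['claude-3-5-sonnet-text-090924.yaml', 'openai-metaprompt-text.yaml']
```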

3. Download and inspect a prompt template:

from prompt_templates import PromptTemplateLoader
prompt_template = PromptTemplateLoader.from_hub(
    repo_id="MoritzLaurer/closed_system_prompts",
    filename="claude-3-5-artifacts-leak-210624.yaml"
)
# Inspect template
print(prompt_template.template)
#[{'role': 'system',
#  'content': '<artifacts_info>\nThe assistant can create and reference artifacts ...'},
# {'role': 'user', 'content': '{{user_message}}'}]
# Check required template variables
print(prompt_template.template_variables)
# ['current_date', 'user_message']
print(prompt_template.metadata)
# {'source': 'https://gist.github.com/dedlim/6bf6d81f77c19e20cd40594aa09e3ecd'}

4. Populate the template with variables:

By default, the populated prompt is returned in the OpenAI messages format, which is compatible with most open-source LLM clients.

messages = prompt_template.populate_template(
    user_message="Create a tic-tac-toe game for me in Python",
    current_date="Wednesday, 11 December 2024"
)
print(messages)
# PopulatedPrompt([{'role': 'system', 'content': '<artifacts_info>\nThe assistant can create and reference artifacts during conversations. Artifacts are ...'}, {'role': 'user', 'content': 'Create a tic-tac-toe game for me in Python'}])
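Under the hood, "double brace" population boils down to simple placeholder substitution in each message's content. A minimal standalone sketch of that mechanism (not the library's actual implementation; the `populate` helper below is hypothetical):

```python
import re

def populate(template: str, **variables) -> str:
    # Replace each {{name}} placeholder with the matching keyword argument.
    return re.sub(r"\{\{\s*(\w+)\s*\}\}",
                  lambda m: str(variables[m.group(1)]),
                  template)

template_messages = [
    {"role": "system", "content": "Today's date is {{current_date}}."},
    {"role": "user", "content": "{{user_message}}"},
]
values = {"current_date": "Wednesday, 11 December 2024",
          "user_message": "Create a tic-tac-toe game for me in Python"}
populated = [{"role": m["role"], "content": populate(m["content"], **values)}
             for m in template_messages]
print(populated[0]["content"])  # Today's date is Wednesday, 11 December 2024.
```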

5. Use the populated template with any LLM client:

from openai import OpenAI
import os
client = OpenAI(api_key=os.environ.get("OPENAI_API_KEY"))
response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=messages
)
print(response.choices[0].message.content[:100], "...")
# Here's a simple text-based Tic-Tac-Toe game in Python. This code allows two players to take turns pl ...
from huggingface_hub import InferenceClient
client = InferenceClient(api_key=os.environ.get("HF_TOKEN"))
response = client.chat.completions.create(
    model="meta-llama/Llama-3.3-70B-Instruct", 
    messages=messages.to_dict(),
    max_tokens=500
)
print(response.choices[0].message.content[:100], "...")
# <antThinking>Creating a tic-tac-toe game in Python is a good candidate for an artifact. It's a self- ...

If you use an LLM client that expects a format different from the OpenAI messages standard, you can easily reformat the prompt for that client.

from anthropic import Anthropic

messages_anthropic = messages.format_for_client(client="anthropic")

client = Anthropic(api_key=os.environ.get("ANTHROPIC_API_KEY"))
response = client.messages.create(
    model="claude-3-sonnet-20240229",
    system=messages_anthropic["system"],
    messages=messages_anthropic["messages"],
    max_tokens=1000
)
print(response.content[0].text[:100], "...")
# Sure, I can create a tic-tac-toe game for you in Python. Here's a simple implementation: ...
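Anthropic's Messages API takes the system prompt as a separate top-level `system` parameter rather than as a message with role "system". A reformatting step like the one above plausibly reduces to splitting the message list accordingly; this standalone sketch (with a hypothetical `to_anthropic_format` helper, not the library's code) illustrates the idea:

```python
def to_anthropic_format(messages):
    # Anthropic's Messages API expects the system prompt as a top-level
    # `system` string; only user/assistant turns go into `messages`.
    system = "\n".join(m["content"] for m in messages if m["role"] == "system")
    rest = [m for m in messages if m["role"] != "system"]
    return {"system": system, "messages": rest}

msgs = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Create a tic-tac-toe game for me in Python"},
]
converted = to_anthropic_format(msgs)
print(converted["system"])         # You are a helpful assistant.
print(len(converted["messages"]))  # 1
```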

6. Create your own prompt templates:

from prompt_templates import ChatPromptTemplate
messages_template = [
    {"role": "system", "content": "You are a coding assistant who explains concepts clearly and provides short examples."},
    {"role": "user", "content": "Explain what {{concept}} is in {{programming_language}}."}
]
template_variables = ["concept", "programming_language"]
metadata = {
    "name": "Code Teacher",
    "description": "A simple chat prompt for explaining programming concepts with examples",
    "tags": ["programming", "education"],
    "version": "0.0.1",
    "author": "Guido van Bossum"
}
prompt_template = ChatPromptTemplate(
    template=messages_template,
    template_variables=template_variables,
    metadata=metadata,
)

print(prompt_template)
# ChatPromptTemplate(template=[{'role': 'system', 'content': 'You are a coding a..., template_variables=['concept', 'programming_language'], metadata={'name': 'Code Teacher', 'description': 'A simple ..., client_parameters={}, custom_data={}, populator_type='double_brace', populator=<prompt_templates.prompt_templates.DoubleBracePopu...)

7. Store or share your prompt templates:

You can then store your prompt template locally or share it on the HF Hub.

# save locally
prompt_template.save_to_local("./tests/test_data/code_teacher_test.yaml")
# or save it on the HF Hub
prompt_template.save_to_hub(repo_id="MoritzLaurer/example_prompts_test", filename="code_teacher_test.yaml", create_repo=True)
# CommitInfo(commit_url='https://huggingface.co/MoritzLaurer/example_prompts_test/commit/4cefd2c94f684f9bf419382f96b36692cd175e84', commit_message='Upload prompt template code_teacher_test.yaml', commit_description='', oid='4cefd2c94f684f9bf419382f96b36692cd175e84', pr_url=None, repo_url=RepoUrl('https://huggingface.co/MoritzLaurer/example_prompts_test', endpoint='https://huggingface.co', repo_type='model', repo_id='MoritzLaurer/example_prompts_test'), pr_revision=None, pr_num=None)

TODO

  • many things ...
