Project description

doprompt-py

DoPrompt is a Python package for parsing and executing dotprompt files - a file format created by Google for storing prompts outside of the code itself.

This library exists to provide vendor-independent tooling for a better way to store and interact with prompts.

Please check out the Visual Studio Code extension to easily play around with your prompts without ripping them out of your code.

Example

  1. Install doprompt
pip install doprompt
  2. Create a chatbot.prompt file
---
model: openai/gpt-4o
config:
  temperature: 0.9
input:
  schema:
    userQuestion: string
---
{{role "system"}}
You are a helpful AI assistant that really loves to talk about food. Try to work
food items into all of your conversations.
{{role "user"}}
{{userQuestion}}
  3. Execute the prompt

You can use our completion API to automatically dispatch the prompt to the right vendor library and pass in the parameters you configured:
from doprompt import DoPrompt

prompt = DoPrompt("./chatbot.prompt")
print(prompt.complete({'userQuestion': 'Which way is the library?'}))

Alternatively, if you need custom APIs, settings, or vendors, you can manually pass down any parameters yourself. The following snippet results in equivalent behaviour:

from doprompt import DoPrompt
from openai import OpenAI

prompt = DoPrompt("./chatbot.prompt")
client = OpenAI()

messages = prompt.get_messages(values={
    'userQuestion': 'What is the capital of France?',
})

completion = client.chat.completions.create(
    messages=messages,
    model=prompt.model_name,
    temperature=prompt.get_config('temperature'),
)
print(completion.choices[0].message.content)

Background

Hardcoding prompts is very constraining, since prompts often need interactive development, which is hard once they're embedded in code. The alternative is a prompt management platform, which creates an additional dependency and point of failure while siloing your prompts away from the code they are tightly integrated with.

Separating them into their own file keeps them close to the code while allowing additional tooling to help you interactively develop them.

Supported features

  • handlebars templating
  • config settings
  • roles
  • schema validation
  • media support
  • alternative templating languages
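The features above build on the dotprompt file layout shown earlier: YAML frontmatter between `---` markers, followed by a handlebars template body. The sketch below illustrates that split and plain `{{variable}}` substitution only; it is not the library's actual parser, the helper names are hypothetical, and real handlebars supports far more than shown here:

```python
import re

def split_dotprompt(text: str):
    """Split a .prompt file into (frontmatter, template body).

    The frontmatter sits between the first two `---` lines; everything
    after the second `---` is the handlebars template.
    """
    match = re.match(r"^---\n(.*?)\n---\n(.*)$", text, re.DOTALL)
    if not match:
        return "", text  # no frontmatter block
    return match.group(1), match.group(2)

def render(template: str, values: dict) -> str:
    """Substitute bare {{name}} placeholders; helpers like {{role "user"}} are left untouched."""
    return re.sub(
        r"\{\{\s*(\w+)\s*\}\}",
        lambda m: str(values.get(m.group(1), m.group(0))),
        template,
    )

source = """---
model: openai/gpt-4o
---
{{role "user"}}
{{userQuestion}}
"""

frontmatter, body = split_dotprompt(source)
print(render(body, {"userQuestion": "Which way is the library?"}))
```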

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

doprompt-0.1.1.tar.gz (4.7 kB view details)

Uploaded Source

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

doprompt-0.1.1-py3-none-any.whl (5.1 kB view details)

Uploaded Python 3

File details

Details for the file doprompt-0.1.1.tar.gz.

File metadata

  • Download URL: doprompt-0.1.1.tar.gz
  • Upload date:
  • Size: 4.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: uv/0.4.29

File hashes

Hashes for doprompt-0.1.1.tar.gz
Algorithm Hash digest
SHA256 374d4e32094e067a71e247ee8def42f3f57b833aec67159241ea22f838bf38ad
MD5 130b1def3a796f6781997c70e33b9eaa
BLAKE2b-256 10003ffd4975c0da741b8bf770e73687d8752f8fbfdfafe13152dff7a974865a

See more details on using hashes here.
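As a quick local check, the SHA256 digest above can be verified with Python's standard hashlib. A minimal sketch, assuming the sdist was downloaded into the current directory:

```python
import hashlib
import os

def sha256_of(path: str) -> str:
    """Compute the SHA-256 digest of a file, streaming it in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

EXPECTED = "374d4e32094e067a71e247ee8def42f3f57b833aec67159241ea22f838bf38ad"

if os.path.exists("doprompt-0.1.1.tar.gz"):
    assert sha256_of("doprompt-0.1.1.tar.gz") == EXPECTED, "hash mismatch"
```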

File details

Details for the file doprompt-0.1.1-py3-none-any.whl.

File metadata

  • Download URL: doprompt-0.1.1-py3-none-any.whl
  • Upload date:
  • Size: 5.1 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: uv/0.4.29

File hashes

Hashes for doprompt-0.1.1-py3-none-any.whl
Algorithm Hash digest
SHA256 46e9005ce5794bf55856c6a6d459d7587d291b9ea0ff54095b43de87768b0e6e
MD5 7eadfac5084d41e2701edae125de0fbf
BLAKE2b-256 add034a133da22e7f11af26aadc4749e6c1637e1e0c37f649ec7c0631c40668a

See more details on using hashes here.
