A flexible system for managing and processing prompt templates

Project description

Prompt Manager

Prompt Manager is a flexible system for managing and processing prompt templates for use with language models. It simplifies loading, formatting, and organizing prompts so you can focus on building your AI applications.

Features

  • Load prompts from text files or direct text input
  • Insert variables into prompts using double braces {{variable}}
  • Handle various file types and encoding issues with a universal file opener
  • Flexible API for ease of use

Installation

Install the package using pip (the distribution is published as llm-prompt-manager; the import name remains prompt_manager):

pip install llm-prompt-manager

Usage

Basic Usage

To load and process a template file named example_template.txt with no dependencies:

from prompt_manager import PromptManager

# Initialize the PromptManager with the path to the prompts folder
pm = PromptManager("path/to/prompts/folder")

# Load a prompt template and insert variables
prompt = pm.get_prompt("example_template.txt", name="John Doe", age=30)
print(prompt)
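The template file itself is not shown in this example; given the double-brace syntax described above, a hypothetical example_template.txt might contain:

Hello {{name}}, you are {{age}} years old.

With the call above, the rendered prompt would read: Hello John Doe, you are 30 years old.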

Using Direct Text Prompts

To pass a prompt as a direct text string and fill its placeholders with values:

from prompt_manager import PromptManager

pm = PromptManager()

# Use a direct text prompt
prompt = pm.get_prompt("Hello, {{name}}!", name="Alice")
print(prompt)
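Assuming the same double-brace substitution applies to direct text, this prints:

Hello, Alice!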

Loading File Content as Variables

To load and process a template and replace placeholders with the content of other files:

from prompt_manager import PromptManager

pm = PromptManager("path/to/prompts/folder")

# Load a prompt template and insert file content as variables
prompt = pm.get_prompt("example_template.txt", variable1=pm.load_file("sub_template1.txt"), variable2=pm.load_file("sub_template2.txt"))
print(prompt)
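As an illustrative sketch (the file contents below are hypothetical), suppose example_template.txt contains:

Context: {{variable1}}
Task: {{variable2}}

and sub_template1.txt and sub_template2.txt contain "You are a helpful assistant." and "Summarize the user's input." respectively. The rendered prompt would then be:

Context: You are a helpful assistant.
Task: Summarize the user's input.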

Exception Handling

  • Raises FileNotFoundError if the template file does not exist.
  • Raises ValueError if values for required placeholders are not provided or if a file dependency cannot be resolved (see the sketch below).
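A minimal handling sketch, assuming the same PromptManager API shown above:

from prompt_manager import PromptManager

pm = PromptManager("path/to/prompts/folder")

try:
    # Rendering fails if the template file is missing or a placeholder value is absent
    prompt = pm.get_prompt("example_template.txt", name="John Doe", age=30)
except FileNotFoundError:
    print("Template file not found in the prompts folder.")
except ValueError as err:
    print(f"Missing placeholder value or file dependency problem: {err}")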

License

This project is licensed under the MIT License - see the LICENSE file for details.

Contributing

If you have suggestions for improvements, feel free to submit a pull request or open an issue.

Contact

Author: Bryan Anye
Email: bryan.anye.5@gmail.com
GitHub: BryanNsoh

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distributions

No source distribution files are available for this release. See the tutorial on generating distribution archives.

Built Distribution

llm_prompt_manager-0.1.0-py3-none-any.whl (3.8 kB, Python 3)

File details

Details for the file llm_prompt_manager-0.1.0-py3-none-any.whl.

File hashes

Hashes for llm_prompt_manager-0.1.0-py3-none-any.whl:

SHA256: b9452ba9145230d0f57a971bdaa3cec86335f8d15017ca130c7dc517332823bf
MD5: d1985c7a4aa600e069a2be8156e8e2c4
BLAKE2b-256: 2fac15a8006bbf6587a0f033329894bc66a185a8856641b04810445345a0f0ec

