
A flexible system for managing and processing prompt templates

Project description

Prompt Manager

Prompt Manager is a flexible system for managing and processing prompt templates for use with language models. It simplifies loading, organizing, and formatting prompts so that you can focus on developing your AI applications.

Features

  • Load prompts from text files or direct text input
  • Insert variables into prompts using double braces {{variable}}
  • Handle various file types and encoding issues with a universal file opener
  • Flexible API for ease of use

Installation

Install the package using pip:

pip install llm_prompt_manager
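
Note that the project is published on PyPI as llm_prompt_manager, while the module you import in code is prompt_manager.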

Usage

Basic Usage

To load and process a template named example_template with no dependencies:

from prompt_manager import PromptManager

# Initialize the PromptManager with the path to the prompts folder
pm = PromptManager("path/to/prompts/folder")

# Load a prompt template and insert variables
prompt = pm.get_prompt("example_template.txt", name="John Doe", age=30)
print(prompt)
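
For reference, a hypothetical example_template.txt (its contents are illustrative, not shipped with the package) could contain:

Hello {{name}}, you are {{age}} years old.

With that file in place, the call above would print: Hello John Doe, you are 30 years old.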

Using Direct Text Prompts

To pass a prompt as a literal string and fill its placeholders with values:

from prompt_manager import PromptManager

pm = PromptManager()

# Use a direct text prompt
prompt = pm.get_prompt("Hello, {{name}}!", name="Alice")
print(prompt)
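
Because the argument is template text rather than a file name, it is rendered directly; the example above prints Hello, Alice!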

Loading File Content as Variables

To load and process a template and replace placeholders with the content of other files:

from prompt_manager import PromptManager

pm = PromptManager("path/to/prompts/folder")

# Load a prompt template and insert file content as variables
prompt = pm.get_prompt("example_template.txt", variable1=pm.load_file("sub_template1.txt"), variable2=pm.load_file("sub_template2.txt"))
print(prompt)
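
As an illustration (these file contents are hypothetical), the main template could reference the sub-templates through its placeholders:

Context:
{{variable1}}

Instructions:
{{variable2}}

Each placeholder is replaced with the full text of the file loaded via load_file, so sub-templates can be composed into larger prompts.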

Exception Handling

  • Raises FileNotFoundError if the template file does not exist.
  • Raises ValueError if required placeholders are not provided or if there are issues with file dependencies.
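
A minimal sketch of handling both exceptions when calling get_prompt (the folder and template names are the ones used in the examples above):

from prompt_manager import PromptManager

pm = PromptManager("path/to/prompts/folder")

try:
    prompt = pm.get_prompt("example_template.txt", name="John Doe", age=30)
except FileNotFoundError:
    # The template file was not found in the prompts folder
    prompt = None
except ValueError as error:
    # A required placeholder was missing or a file dependency could not be resolved
    print(f"Template error: {error}")
    prompt = None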

License

This project is licensed under the MIT License - see the LICENSE file for details.

Contributing

If you have suggestions for improvements, feel free to submit a pull request or open an issue.

Contact

  • Author: Bryan Anye
  • Email: bryan.anye.5@gmail.com
  • GitHub: BryanNsoh

Download files

Download the file for your platform.

Source Distribution

llm_prompt_manager-0.1.2.tar.gz (3.4 kB, Source)

Built Distribution

llm_prompt_manager-0.1.2-py3-none-any.whl (3.9 kB, Python 3)

File details

Details for the file llm_prompt_manager-0.1.2.tar.gz.

File metadata

  • Download URL: llm_prompt_manager-0.1.2.tar.gz
  • Upload date:
  • Size: 3.4 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.11.2

File hashes

Hashes for llm_prompt_manager-0.1.2.tar.gz

  • SHA256: 8e97aaf6622a166b15cf279e12cb75b79f4d5401b4e446ebe951aee2a1fdc710
  • MD5: d10b211014634ecdbffa68ccd31bc23b
  • BLAKE2b-256: c6030592e8f8113d5653891058ff4408c1bd723703c7cba72477d3dd2d534158

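As an example, you could verify a downloaded archive against the SHA256 digest above using Python's standard library (the archive is assumed to be in the current directory):

import hashlib

# SHA256 digest published for the source distribution (copied from the list above)
expected = "8e97aaf6622a166b15cf279e12cb75b79f4d5401b4e446ebe951aee2a1fdc710"

# Compute the digest of the downloaded archive and compare
with open("llm_prompt_manager-0.1.2.tar.gz", "rb") as archive:
    actual = hashlib.sha256(archive.read()).hexdigest()

print("Hash OK" if actual == expected else "Hash mismatch")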

File details

Details for the file llm_prompt_manager-0.1.2-py3-none-any.whl.

File metadata

File hashes

Hashes for llm_prompt_manager-0.1.2-py3-none-any.whl

  • SHA256: c0ed3d70d4c2383c1ba226b91be9b7a8e26ab4f67624f67f16b40b1e43c6fba4
  • MD5: d1084a23cf82730def5389e4b094df2e
  • BLAKE2b-256: 95d836e102f084d8c19392a48999f609cfc0f02899d99beb794ce24b05b0b220

