A flexible system for managing and processing prompt templates

Project description

Prompt Manager

Prompt Manager is a flexible system for managing and processing prompt templates for use with language models. It simplifies the process of loading, managing, and formatting prompts, allowing you to focus on developing your AI applications.

Features

  • Load prompts from text files or direct text input
  • Insert variables into prompts using double braces {{variable}}
  • Handle various file types and encoding issues with a universal file opener
  • Flexible API for ease of use

Installation

Install the package using pip:

pip install llm_prompt_manager

Usage

Basic Usage

To load and process a template file named example_template.txt with no dependencies:

from prompt_manager import PromptManager

# Initialize the PromptManager with the path to the prompts folder
pm = PromptManager("path/to/prompts/folder")

# Load a prompt template and insert variables
prompt = pm.get_prompt("example_template.txt", name="John Doe", age=30)
print(prompt)
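
For illustration, suppose example_template.txt (a hypothetical file, not shipped with the package) contains:

Hello {{name}}! You are {{age}} years old.

The keyword arguments fill the matching placeholders, so the call above would print "Hello John Doe! You are 30 years old."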

Using Direct Text Prompts

To pass a prompt directly as text and fill its placeholders with values:

from prompt_manager import PromptManager

pm = PromptManager()

# Use a direct text prompt
prompt = pm.get_prompt("Hello, {{name}}!", name="Alice")
print(prompt)

Loading File Content as Variables

To load and process a template and replace placeholders with the content of other files:

from prompt_manager import PromptManager

pm = PromptManager("path/to/prompts/folder")

# Load a prompt template and insert file content as variables
prompt = pm.get_prompt("example_template.txt", variable1=pm.load_file("sub_template1.txt"), variable2=pm.load_file("sub_template2.txt"))
print(prompt)

Exception Handling

  • Raises FileNotFoundError if the template file does not exist.
  • Raises ValueError if required placeholders are not provided or if there are issues with file dependencies.
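
A minimal sketch of catching both errors around a call (the template name and variables are the hypothetical ones from the examples above):

from prompt_manager import PromptManager

pm = PromptManager("path/to/prompts/folder")

try:
    prompt = pm.get_prompt("example_template.txt", name="John Doe", age=30)
except FileNotFoundError:
    # The named template is not in the prompts folder
    print("Template file not found.")
except ValueError as err:
    # A placeholder was left unfilled, or a file dependency could not be resolved
    print(f"Invalid prompt inputs: {err}")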

License

This project is licensed under the MIT License - see the LICENSE file for details.

Contributing

If you have suggestions for improvements, feel free to submit a pull request or open an issue.

Contact

Author: Bryan Anye
Email: bryan.anye.5@gmail.com
GitHub: BryanNsoh

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

llm_prompt_manager-0.1.1.tar.gz (3.3 kB)

Uploaded Source

Built Distribution

llm_prompt_manager-0.1.1-py3-none-any.whl (3.8 kB)

Uploaded Python 3

File details

Details for the file llm_prompt_manager-0.1.1.tar.gz.

File metadata

  • Download URL: llm_prompt_manager-0.1.1.tar.gz
  • Upload date:
  • Size: 3.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.11.2

File hashes

Hashes for llm_prompt_manager-0.1.1.tar.gz

  • SHA256: d3013fca138b7eb30062542345b94c6ac44110e2dadea85fa97274d147f67e3c
  • MD5: e4bd249c39609c92da5ef461d0a5a3c0
  • BLAKE2b-256: 533b7796d447708b6f69faad4d1a65b0d564636772af07fe4c5efd0f829c7056

See more details on using hashes here.
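
As a rough sketch, the published SHA256 digest can be checked against a downloaded copy of the archive with Python's standard hashlib (the local file path is an assumption about where the file was saved):

import hashlib

# Path to the downloaded sdist; adjust to wherever it was saved
path = "llm_prompt_manager-0.1.1.tar.gz"
expected = "d3013fca138b7eb30062542345b94c6ac44110e2dadea85fa97274d147f67e3c"

with open(path, "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()

print("OK" if digest == expected else "Hash mismatch")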

File details

Details for the file llm_prompt_manager-0.1.1-py3-none-any.whl.

File hashes

Hashes for llm_prompt_manager-0.1.1-py3-none-any.whl

  • SHA256: 9acd3abdefa4eaeba633288d9d3fc4c36906ec45d69cffa40009d2dd5318323c
  • MD5: f6f46518dfa68b8bc9fe857f9e3f9106
  • BLAKE2b-256: 9f79dabad124bc16c83583b7bc373b586329d7790de721646b9d982c55140fc0

See more details on using hashes here.
