Lightweight, extensible prompt templating for LLMs
LitePrompt: A Flexible Template-Based Prompt Engineering Library
Scalability isn’t just about growing—it’s about staying lean and focusing on what truly matters. Flexibility isn’t just a feature; it’s the core of LitePrompt, built to adapt to your unique needs.
LitePrompt is a Python library designed to provide a flexible and extensible system for managing and generating prompts for conversational AI systems. It leverages template-based prompt generation, with support for various storage backends, caching mechanisms, and structured prompt formats. The library allows developers to create complex, context-aware prompts with minimal setup and dependencies.
Key Features
- Template-based prompt generation: Utilizes Jinja2 for flexible template rendering.
- Multiple storage backends support: Load templates from local files, Amazon S3, Google Cloud Storage, and local packages.
- Built-in caching: Optimizes performance by caching templates.
- Role-based messaging: Generates structured prompts with user, system, and assistant roles.
- Multi-modal content: Supports text, images, and file-based content.
- Flexible template management: Load and manage templates with ease.
Usage Instructions
Prerequisites
- Python version: >=3.10
- Required packages:
  - PyYAML>=5.4
  - Jinja2>=3.0
  - Jinja2schema==0.1.4
- Optional packages (install based on your use case):
  - boto3 (for S3 support)
  - google-cloud-storage (for GCS support)
  - tiktoken (for token counting)
  - cachetools
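Since backends such as S3 are optional, their imports are typically guarded at runtime rather than at install time. A minimal sketch of that pattern, assuming a hypothetical helper (`require_boto3` and the exact error text are illustrative, not LitePrompt's actual API):

```python
# Sketch: fail with a clear message only when the optional backend is used.
# require_boto3 is a hypothetical helper, not part of LitePrompt's API.
def require_boto3():
    try:
        import boto3  # optional dependency, needed only for the S3 loader
    except ImportError as exc:
        raise ImportError("'boto3' is required but not installed") from exc
    return boto3
```

The same guard works for google-cloud-storage, tiktoken, and cachetools, keeping the core install dependency-light.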
Installation
You can install LitePrompt using pip or directly from the source:
# Install via pip
pip install liteprompt
# Install from source
git clone https://github.com/ggiallo28/liteprompt.git
cd liteprompt
pip install .
Quick Start
Here's a quick example of how to use LitePrompt to generate a prompt from a template:
from liteprompt.loaders import LiteLocalFSTemplateLoader
from liteprompt.prompt import LitePrompt
# Create a template loader
loader = LiteLocalFSTemplateLoader(
template_path="./templates/main_prompt.yml.j2"
)
# Define template data
template_data = {
"character_name": "Ada",
"username": "User",
"user_query": "Hello!"
}
# Create and render prompt
prompt = LitePrompt(
template_data=template_data,
template_loader=loader
)
# Access generated messages
print(prompt.messages)
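For context, the file referenced above (`./templates/main_prompt.yml.j2`) could be a YAML list of role-tagged messages rendered through Jinja2. The sketch below is hypothetical, built only from the variables in `template_data`; the actual schema is defined by LitePrompt, not this example:

```yaml
# Hypothetical contents of ./templates/main_prompt.yml.j2
- id: system_prompt
  role: system
  content: You are {{ character_name }}, a helpful assistant.
- id: user_query
  role: user
  content: "{{ username }}: {{ user_query }}"
```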
More Detailed Examples
1. Using Raw Templates:
raw_template = """
- id: system_prompt
role: system
content: {{ content }}
"""
prompt = LitePrompt(
raw_template=raw_template,
template_data={"content": "You are a helpful assistant."}
)
2. Multi-modal Content:
raw_template = """
- id: user_parts
role: user
content:
- type: text
text: {{ text }}
- type: image_url
image_url:
url: {{ image_url }}
detail: high
"""
prompt = LitePrompt(
raw_template=raw_template,
template_data={
"text": "Please analyze this image.",
"image_url": "https://example.com/image.png"
}
)
3. Message Manipulation:
# Create a base prompt (this example assumes LiteMessage has also been imported from the liteprompt package)
prompt = LitePrompt(raw_template=template, template_data=data)
# Replace an existing message
prompt.assign(
at="chat_log_1",
item=LiteMessage(
id="chat_log_1",
role="user",
content="Alice: Can you help me with something?"
)
)
# Append a message after a specific ID
prompt.append(
item=LiteMessage(
id="chat_log_3",
role="assistant",
content="Bot: Sure, what do you need help with?"
),
after="chat_log_2"
)
# Prepend a message before everything
prompt.prepend(
item=LiteMessage(
id="system_msg",
role="system",
content="System: Starting new chat session."
)
)
# Get list of message IDs
message_ids = prompt.list_message_ids()
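The assign/append/prepend semantics above can be sketched over a plain list of messages. This is an illustrative reimplementation under assumed semantics, not LitePrompt's internals; the `Message` dataclass stands in for `LiteMessage`:

```python
from dataclasses import dataclass

@dataclass
class Message:
    id: str
    role: str
    content: str

def _index_of(messages, message_id):
    # Find the position of a message by its id; raises if the id is absent.
    return next(i for i, m in enumerate(messages) if m.id == message_id)

def assign(messages, at, item):
    # Replace the message whose id matches `at`.
    messages[_index_of(messages, at)] = item

def append(messages, item, after=None):
    # Insert after a given id, or at the end when no anchor is given.
    if after is None:
        messages.append(item)
    else:
        messages.insert(_index_of(messages, after) + 1, item)

def prepend(messages, item):
    # Insert before everything.
    messages.insert(0, item)

msgs = [
    Message("chat_log_1", "user", "Alice: Can you help me?"),
    Message("chat_log_2", "assistant", "Bot: Of course."),
]
append(msgs, Message("chat_log_3", "assistant", "Bot: What do you need?"), after="chat_log_2")
prepend(msgs, Message("system_msg", "system", "System: Starting new chat session."))
print([m.id for m in msgs])  # ['system_msg', 'chat_log_1', 'chat_log_2', 'chat_log_3']
```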
Template Management and Structure
The LitePrompt library utilizes a flexible system for loading and managing templates, which can be stored across different backends such as local files, cloud storage (S3, Google Cloud), or packaged templates. The default template loader is capable of handling various types of prompt templates that are structured into multiple sections for different use cases.
Default Loader Functionality
The default loader is designed to load templates from the local filesystem, allowing for an organized and modular template structure. It supports various types of templates, including:
- Agent Prompts: These templates define different agent behaviors, such as interactive, memory-based, or task-specific agents.
- Procedure Prompts: Templates focused on guiding an agent through specific steps or procedures, including detailed instructions, context, and summaries.
- Memory-Based Prompts: Templates for managing memory-related tasks, like recalling past interactions or episodic memories.
- Tool and System Prompts: Templates that help integrate tools or system-level instructions with the conversational flow.
- User Query and Response Templates: Specific templates for handling user queries, formulating responses, and managing chat histories.
These templates are stored within subdirectories, allowing for easy organization and flexibility when adding or updating individual components. The system is designed to load only the relevant templates based on the specific requirements of the prompt generation, making it efficient for creating context-aware and dynamic interactions.
The default loader seamlessly integrates these templates into the prompt generation process, allowing developers to specify the path and context for each prompt while maintaining clean separation of concerns across the different sections of the prompt.
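As an illustration, a template directory organized along these lines might look like the following (a hypothetical layout, not one prescribed by the library):

```text
templates/
├── agents/        # interactive, memory-based, and task-specific agent prompts
├── procedures/    # step-by-step instructions, context, and summaries
├── memory/        # recall and episodic-memory prompts
├── tools/         # tool and system-level prompts
└── user/          # query, response, and chat-history templates
```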
Troubleshooting
Common Issues and Solutions:
- Missing Dependencies:
  - Error: ImportError: 'boto3' is required but not installed
  - Solution: pip install boto3
- Template Not Found:
  - Ensure the template path is correct:
    loader = LiteLocalFSTemplateLoader(
        template_path="./absolute/path/to/template.yml.j2"
    )
- Template Rendering Errors:
  - Check for syntax errors in the template.
  - Ensure all required template variables are provided.
  - Enable debug logging to inspect errors:
    import logging
    logging.basicConfig(level=logging.DEBUG)
Data Flow
The flow of data through the LitePrompt system involves multiple components that work together to generate formatted prompts for AI models:
[Template Source] -> [Template Loader] -> [Template Engine]
                             |                     |
                             v                     v
[Template Cache] <- [Prompt Generator] -> [Formatted Messages]
        ^                    |
        |                    v
[Template Registry]  [Content Validation]
Component Interactions:
- Template Loaders: Fetch templates from various sources like local files, Amazon S3, Google Cloud Storage, or local packages.
- Template Cache: Caches loaded templates to avoid repeated fetches and speed up response times.
- Template Engine (Jinja2): Renders templates with provided data.
- Prompt Generator: Creates structured message formats, ready to be used by AI models.
- Content Validation: Ensures the correctness of the rendered content, such as proper formatting and required fields.
- Template Registry: Manages template discovery and versioning for reuse.
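To make the flow concrete, here is a dependency-free sketch of the loader -> cache -> engine path, using Python's stdlib string.Template in place of Jinja2 (all names here are illustrative, not LitePrompt's API):

```python
from string import Template

class TemplateCache:
    """Tiny in-memory cache standing in for LitePrompt's caching layer."""
    def __init__(self):
        self._store = {}

    def get_or_load(self, name, loader):
        # Only hit the loader on a cache miss.
        if name not in self._store:
            self._store[name] = loader(name)
        return self._store[name]

def local_loader(name):
    # Stand-in for a template loader; real backends read files, S3, GCS, etc.
    return Template("- role: system\n  content: You are $character.")

cache = TemplateCache()
tmpl = cache.get_or_load("main", local_loader)  # first fetch: invokes the loader
rendered = tmpl.substitute(character="Ada")     # engine renders the template data
print(rendered)
```

A second `get_or_load("main", ...)` call returns the cached object without touching the loader, which is the performance win the caching layer provides.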
Why Choose LitePrompt?
LitePrompt is designed to be a flexible, efficient, and lightweight solution for managing prompts in conversational AI systems. Its modular architecture separates core functionality from advanced features, so you only install what you actually need. Whether you're working with local files, cloud storage, or caching, the library adapts to your workflow without unnecessary bloat.
At the heart of LitePrompt is a philosophy of practical scalability—not just the ability to grow, but the freedom to stay small and focused. Every feature is plug-and-play, every dependency is optional, and every implementation puts your specific needs first. LitePrompt stays lean so you can move fast, customize freely, and scale only when and how it makes sense for you—even if that means rewriting part of the code to fit your use case.
License
This project is licensed under the MIT License. See the LICENSE file for more details.
Contributing
We welcome contributions!
Download files
Download the file for your platform.
Source Distribution
Built Distribution
File details
Details for the file liteprompt-0.2.10.tar.gz.
File metadata
- Download URL: liteprompt-0.2.10.tar.gz
- Upload date:
- Size: 24.0 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.11.12
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | a11e62d04293e8ad42a14f4b5494f9e87dce4e08b5ed294cf7e66981ee34edee |
| MD5 | 6456ea3a5e81e40daacc2a1c6500693b |
| BLAKE2b-256 | 74da06cffb6cc90a442979713a1fd6c468bc0fbaeb60c8ac3bcfa48eb8ccff80 |
File details
Details for the file liteprompt-0.2.10-py3-none-any.whl.
File metadata
- Download URL: liteprompt-0.2.10-py3-none-any.whl
- Upload date:
- Size: 26.7 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.11.12
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | fb0fb3e017b9c7928f465c02c2835ac22c06a905730cd5135feb5434b4f2b159 |
| MD5 | 78ecde4150c3422c8a4fa4bb992cad7e |
| BLAKE2b-256 | fc492688d76e7fbde34b0291eba0f6b3f662ef6a31c4bd0cdd752ae839c256a4 |