# PromptFlow
A comprehensive prompt management library for Large Language Models with built-in version control, storage, and activation strategies.
## Features

- Prompt Version Control & Storage: Track prompt versions and their changes over time
- SQLite Database Integration: Lightweight persistence using Tortoise ORM with SQLite
- Dynamic Prompt Switching & Activation Logic: Flexible strategies for prompt selection
- Prompt Categorization: Organize prompts by use case (Chat, Search, Summarization, etc.)
- Fallback Prompts: Define fallback prompts to use when a primary prompt fails
- Easy-to-use Python API: Simple to embed in your applications
- Strong Schema Validation: Type safety with Pydantic
- Optional FastAPI Integration: Expose prompt management via a REST API
- Native Streamlit UI: Visual interface for prompt management and testing
- LangChain Integration: Use PromptFlow with LangChain for enhanced prompt management
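The fallback behaviour listed above can be illustrated with a small, library-independent sketch (the `select_with_fallback` helper and `store` dict here are hypothetical illustrations, not PromptFlow API):

```python
def select_with_fallback(fetch, primary, fallback):
    """Return the primary prompt if it loads successfully, else the fallback."""
    try:
        prompt = fetch(primary)
    except Exception:
        prompt = None
    return prompt if prompt is not None else fetch(fallback)

store = {"main": "You are a helpful assistant.", "backup": "Answer briefly."}
print(select_with_fallback(store.get, "missing", "backup"))  # Answer briefly.
```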
## Installation

Note that the distribution on PyPI is named `evoluteprompt` (see the file details below), even though the package imports as `promptflow`:

```bash
# Basic installation
pip install evoluteprompt

# With UI support
pip install "evoluteprompt[ui]"

# With API and UI support
pip install "evoluteprompt[all]"
```
## Quick Start

```python
from promptflow import PromptFlow, PromptCategory

# Initialize PromptFlow
flow = PromptFlow()
flow.init()

# Create a prompt
prompt_builder = flow.create_prompt()
prompt_builder.add_system("You are a helpful assistant.")
prompt_builder.add_user("What is the capital of France?")
prompt = prompt_builder.build()

# Add metadata
prompt.update_metadata(
    description="A simple geography question",
    tags=["geography", "test"],
    category=PromptCategory.QA,
)

# Save the prompt
version = flow.save_prompt("capital_question", prompt)
print(f"Saved prompt version: {version}")

# Retrieve the prompt
retrieved = flow.get_prompt("capital_question")
```
## Using the UI
PromptFlow includes a native Streamlit UI for prompt management and testing. To use it:
```bash
# If installed with pip
promptflow ui

# Or run directly from the module
python -m promptflow.cli ui
```
The UI provides:
- Prompt creation and management
- Version control and history
- Fallback configuration
- Prompt testing interface
- Database settings
## Templates

```python
# Create a template
template = flow.template_from_string(
    "What is the capital of {{country}}?",
    variables={"country": "France"}
)

# Render the template with its default variables
rendered = template.render()

# Or override with different variables
rendered = template.render(country="Germany")
```
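Under the hood, `{{variable}}` substitution of this kind can be implemented with a single regex pass. A minimal, self-contained sketch (independent of PromptFlow's actual renderer; `render_template` is a hypothetical helper):

```python
import re

def render_template(template: str, **variables) -> str:
    """Replace each {{name}} placeholder with its value from `variables`."""
    def substitute(match: re.Match) -> str:
        name = match.group(1)
        if name not in variables:
            raise KeyError(f"missing template variable: {name}")
        return str(variables[name])
    return re.sub(r"\{\{\s*(\w+)\s*\}\}", substitute, template)

print(render_template("What is the capital of {{country}}?", country="Germany"))
# What is the capital of Germany?
```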
## Strategies

```python
# Simple active prompt selection
prompt = flow.get_active_prompt("my_prompt")

# With fallback
strategy = flow.with_fallback()
prompt = flow.select_prompt("my_prompt", strategy=strategy)

# A/B testing
ab_strategy = flow.create_ab_testing(
    prompt_variants=["prompt_a", "prompt_b"],
    weights=[0.7, 0.3]
)
prompt = flow.select_prompt("doesn't_matter", strategy=ab_strategy)

# Context-aware selection
context_strategy = flow.create_context_aware(
    context_key="language",
    prompt_mapping={
        "en": "english_prompt",
        "es": "spanish_prompt",
        "fr": "french_prompt"
    }
)
prompt = flow.select_prompt(
    "fallback_prompt",
    strategy=context_strategy,
    context={"language": "es"}
)
```
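The A/B strategy above amounts to weighted random selection. A minimal sketch of that idea (the `select_ab_variant` helper is hypothetical, not PromptFlow API):

```python
import random

def select_ab_variant(variants, weights, rng=random):
    """Pick one variant name according to the given probability weights."""
    if len(variants) != len(weights):
        raise ValueError("variants and weights must be the same length")
    return rng.choices(variants, weights=weights, k=1)[0]

# With weights [0.7, 0.3], "prompt_a" is chosen roughly 70% of the time.
picks = [select_ab_variant(["prompt_a", "prompt_b"], [0.7, 0.3]) for _ in range(10000)]
print(picks.count("prompt_a") / len(picks))  # ≈ 0.7
```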
## FastAPI Integration

See `examples/fastapi_integration.py` in the repository for a complete example.
## LangChain Integration
PromptFlow can be easily integrated with LangChain to combine robust prompt management with LangChain's orchestration capabilities:
```python
from langchain_core.messages import HumanMessage, SystemMessage

# Convert a PromptFlow prompt to LangChain message objects
def promptflow_to_langchain_messages(pf_prompt):
    lc_messages = []
    for msg in pf_prompt.messages:
        if msg.role.value == "system":
            lc_messages.append(SystemMessage(content=msg.content))
        elif msg.role.value == "user":
            lc_messages.append(HumanMessage(content=msg.content))
    return lc_messages

# Use PromptFlow's versioning with LangChain
prompt = flow.get_active_prompt("my_prompt")
lc_messages = promptflow_to_langchain_messages(prompt)
result = llm.generate([lc_messages])
```
For detailed examples, see LangChain Integration.
## Version Control

```python
# List all prompts
prompts = flow.list_prompts()

# List all versions of a prompt
versions = flow.list_versions("my_prompt")

# Get a specific version
prompt = flow.get_prompt("my_prompt", version="0.1.0")

# Set a version as active
flow.set_active("my_prompt", "0.2.0")
```
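The version strings in these examples ("0.1.0", "0.2.0") look like semantic versions, which must be compared numerically: a plain string sort would wrongly rank "0.9.1" above "0.10.0". A minimal sketch of numeric version comparison (the `parse_version` helper is illustrative, not PromptFlow API):

```python
def parse_version(v: str) -> tuple[int, ...]:
    """Turn '0.2.0' into (0, 2, 0) so versions sort numerically, not lexically."""
    return tuple(int(part) for part in v.split("."))

versions = ["0.10.0", "0.2.0", "0.9.1"]
latest = max(versions, key=parse_version)
print(latest)  # 0.10.0
```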
## Development
For information about setting up the development environment, running tests, and contributing to PromptFlow, see the following resources:
- Testing Documentation: Instructions for running and writing tests
- Contributing Guidelines: Guidelines for contributing to the project
## License
MIT
## File details

Details for the file `evoluteprompt-0.1.0.tar.gz`:

- Download URL: evoluteprompt-0.1.0.tar.gz
- Upload date:
- Size: 30.6 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/1.8.2 CPython/3.12.2 Darwin/24.3.0

File hashes:

| Algorithm | Hash digest |
|---|---|
| SHA256 | `48b57418e3a271feababe26aec8a98750409b539447901091786648d48fd8e88` |
| MD5 | `045fb18d8e3817fdac6bbab798e470af` |
| BLAKE2b-256 | `fb9f3309d233f7604dc8b3c1d01b8b0c1976550a9ff76a8c0108b8fafa2805d3` |
Details for the file `evoluteprompt-0.1.0-py3-none-any.whl`:

- Download URL: evoluteprompt-0.1.0-py3-none-any.whl
- Upload date:
- Size: 39.8 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/1.8.2 CPython/3.12.2 Darwin/24.3.0

File hashes:

| Algorithm | Hash digest |
|---|---|
| SHA256 | `3a761d17c5fc133d1d48367d1fd7ef012a29efc8d2e0f794bda037bdc40d5e0a` |
| MD5 | `962747d731c54970fef4058c34e0adcd` |
| BLAKE2b-256 | `2ea32b20cc2808d89697270fa2d2c044c6ebf09d1b72bbca63e59d958eab3488` |