LLM Prompt Manager

Project description

PromptFlow

A comprehensive prompt management library for Large Language Models with built-in version control, storage, and activation strategies.

Features

  • Prompt Version Control & Storage: Keep track of prompt versions and their changes over time
  • SQLite Database Integration: Lightweight persistence using Tortoise ORM with SQLite
  • Dynamic Prompt Switching & Activation Logic: Flexible strategies for prompt selection
  • Prompt Categorization: Organize prompts by use case (Chat, Search, Summarization, etc.)
  • Fallback Prompts: Define fallback prompts when primary prompts fail
  • Easy-to-use Python API: Simple to embed and use in your applications
  • Strong Schema Validation: Type safety with Pydantic
  • Optional FastAPI Integration: Expose prompt management via REST API
  • Native Streamlit UI: Visual interface for prompt management and testing
  • LangChain Integration: Use PromptFlow with LangChain for enhanced prompt management

Installation

# Basic installation (the distribution on PyPI is named evoluteprompt)
pip install evoluteprompt

# With UI support
pip install "evoluteprompt[ui]"

# With API and UI support
pip install "evoluteprompt[all]"

Quick Start

from promptflow import PromptFlow
from promptflow import PromptCategory

# Initialize PromptFlow
flow = PromptFlow()
flow.init()

# Create a prompt
prompt_builder = flow.create_prompt()
prompt_builder.add_system("You are a helpful assistant.")
prompt_builder.add_user("What is the capital of France?")
prompt = prompt_builder.build()

# Add metadata
prompt.update_metadata(
    description="A simple geography question",
    tags=["geography", "test"],
    category=PromptCategory.QA
)

# Save the prompt
version = flow.save_prompt("capital_question", prompt)
print(f"Saved prompt version: {version}")

# Retrieve the prompt
retrieved = flow.get_prompt("capital_question")
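A built prompt is, at its core, an ordered list of role-tagged messages. The following self-contained sketch mirrors that structure; the `Message` and `PromptBuilder` classes here are illustrative stand-ins, not PromptFlow's actual internals:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Message:
    """One role-tagged chat message."""
    role: str
    content: str

class PromptBuilder:
    """Illustrative builder: appends messages in call order, like the API above."""
    def __init__(self) -> None:
        self._messages: List[Message] = []

    def add_system(self, content: str) -> "PromptBuilder":
        self._messages.append(Message("system", content))
        return self

    def add_user(self, content: str) -> "PromptBuilder":
        self._messages.append(Message("user", content))
        return self

    def build(self) -> List[Message]:
        # Return a copy so the built prompt is independent of the builder
        return list(self._messages)

prompt = (PromptBuilder()
          .add_system("You are a helpful assistant.")
          .add_user("What is the capital of France?")
          .build())
```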

Using the UI

PromptFlow includes a native Streamlit UI for prompt management and testing. To use it:

# If installed with pip
promptflow ui

# Or directly from the module
python -m promptflow.cli ui

The UI provides:

  • Prompt creation and management
  • Version control and history
  • Fallback configuration
  • Prompt testing interface
  • Database settings

Templates

# Create a template
template = flow.template_from_string(
    "What is the capital of {{country}}?", 
    variables={"country": "France"}
)

# Render the template
rendered = template.render()

# Or with different variables
rendered = template.render(country="Germany")
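The `{{variable}}` placeholder syntax can be reproduced in a few lines of standard-library Python. This sketch shows the substitution semantics, with call-time overrides winning over stored defaults; it is an illustration, not PromptFlow's implementation:

```python
import re

def render_template(text, defaults=None, **overrides):
    """Replace {{name}} placeholders; keyword overrides win over defaults."""
    variables = {**(defaults or {}), **overrides}
    return re.sub(r"\{\{(\w+)\}\}", lambda m: str(variables[m.group(1)]), text)

template = "What is the capital of {{country}}?"
rendered = render_template(template, defaults={"country": "France"})
rendered_override = render_template(template, defaults={"country": "France"}, country="Germany")
```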

Strategies

# Simple active prompt selection
prompt = flow.get_active_prompt("my_prompt")

# With fallback
strategy = flow.with_fallback()
prompt = flow.select_prompt("my_prompt", strategy=strategy)

# A/B testing
ab_strategy = flow.create_ab_testing(
    prompt_variants=["prompt_a", "prompt_b"],
    weights=[0.7, 0.3]
)
prompt = flow.select_prompt("doesn't_matter", strategy=ab_strategy)  # name is ignored; the strategy picks a variant
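The 0.7/0.3 weights above amount to a weighted random draw over variant names. A self-contained sketch of that selection logic (illustrative only, not the library's internals):

```python
import random

def select_variant(variants, weights, rng=random):
    """Pick one variant name with probability proportional to its weight."""
    return rng.choices(variants, weights=weights, k=1)[0]

# Over many draws, "prompt_a" is chosen roughly 70% of the time
rng = random.Random(0)  # seeded for reproducibility
picks = [select_variant(["prompt_a", "prompt_b"], [0.7, 0.3], rng) for _ in range(1000)]
share_a = picks.count("prompt_a") / len(picks)
```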

# Context-aware selection
context_strategy = flow.create_context_aware(
    context_key="language",
    prompt_mapping={
        "en": "english_prompt",
        "es": "spanish_prompt",
        "fr": "french_prompt"
    }
)
prompt = flow.select_prompt(
    "fallback_prompt", 
    strategy=context_strategy,
    context={"language": "es"}
)

FastAPI Integration

# See examples/fastapi_integration.py for a complete example

LangChain Integration

PromptFlow can be easily integrated with LangChain to combine robust prompt management with LangChain's orchestration capabilities:

# Convert a PromptFlow prompt to LangChain format
from langchain_core.messages import HumanMessage, SystemMessage

def promptflow_to_langchain_messages(pf_prompt):
    lc_messages = []
    for msg in pf_prompt.messages:
        if msg.role.value == "system":
            lc_messages.append(SystemMessage(content=msg.content))
        elif msg.role.value == "user":
            lc_messages.append(HumanMessage(content=msg.content))
    return lc_messages

# Use PromptFlow's versioning with LangChain
# (`llm` is any initialized LangChain chat model, e.g. ChatOpenAI)
prompt = flow.get_active_prompt("my_prompt")
lc_messages = promptflow_to_langchain_messages(prompt)
result = llm.generate([lc_messages])

For detailed examples, see LangChain Integration.

Version Control

# List all prompts
prompts = flow.list_prompts()

# List all versions of a prompt
versions = flow.list_versions("my_prompt")

# Get a specific version
prompt = flow.get_prompt("my_prompt", version="0.1.0")

# Set a version as active
flow.set_active("my_prompt", "0.2.0")
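Version strings like these must be ordered numerically, not lexicographically: as a plain string, "0.10.0" sorts before "0.9.0". A small self-contained sketch of the correct ordering (independent of PromptFlow, which may handle this differently internally):

```python
def version_key(version):
    """Split 'X.Y.Z' into an integer tuple so versions compare numerically."""
    return tuple(int(part) for part in version.split("."))

versions = ["0.9.0", "0.10.0", "0.2.1"]
latest = max(versions, key=version_key)  # "0.10.0", not "0.9.0"
```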

Development

For information about setting up the development environment, running tests, and contributing to PromptFlow, see the project's contributing guide.

License

MIT

Download files

Download the file for your platform.

Source Distribution

evoluteprompt-0.1.1.tar.gz (30.8 kB)

Uploaded Source

Built Distribution

evoluteprompt-0.1.1-py3-none-any.whl (39.8 kB)

Uploaded Python 3

File details

Details for the file evoluteprompt-0.1.1.tar.gz.

File metadata

  • Download URL: evoluteprompt-0.1.1.tar.gz
  • Upload date:
  • Size: 30.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.8.2 CPython/3.12.2 Darwin/24.3.0

File hashes

Hashes for evoluteprompt-0.1.1.tar.gz
Algorithm Hash digest
SHA256 7b37d2846d859afd6264ca27af4167274f139cb272ff68a46318714e7e9c9f16
MD5 8f037aa8e8c3a4cfece8a7920469deab
BLAKE2b-256 a42aaf4c81a1a5f2fd2814163499bfd36dee3e133480e6dc1681f7f7b3c93113

File details

Details for the file evoluteprompt-0.1.1-py3-none-any.whl.

File metadata

  • Download URL: evoluteprompt-0.1.1-py3-none-any.whl
  • Upload date:
  • Size: 39.8 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.8.2 CPython/3.12.2 Darwin/24.3.0

File hashes

Hashes for evoluteprompt-0.1.1-py3-none-any.whl
Algorithm Hash digest
SHA256 9d456c8fdd3cd19469418ba657e6692bd01acc81aab5b5f8d6f6224878e51055
MD5 2f8c4ef397f85baf518b3cad0ac0b0df
BLAKE2b-256 e51d775052a8eb783cf8342ed2b499993f1b3b954d4a71a1344ebe57b5e94203
