LLM Prompt Manager

Project description

EvolutePrompt

A comprehensive prompt management library for Large Language Models with built-in version control, storage, and activation strategies.

Features

  • Prompt Version Control & Storage: Keep track of prompt versions and their changes over time
  • SQLite Database Integration: Lightweight persistence using Tortoise ORM with SQLite
  • Dynamic Prompt Switching & Activation Logic: Flexible strategies for prompt selection
  • Prompt Categorization: Organize prompts by use case (Chat, Search, Summarization, etc.)
  • Fallback Prompts: Define fallback prompts when primary prompts fail
  • Easy-to-use Python API: Simple to embed and use in your applications
  • Strong Schema Validation: Type-safety with Pydantic
  • Optional FastAPI Integration: Expose prompt management via REST API
  • Native Streamlit UI: Visual interface for prompt management and testing
  • LangChain Integration: Use EvolutePrompt with LangChain for enhanced prompt management

Installation

# Basic installation
pip install EvolutePrompt

# With UI support
pip install EvolutePrompt[ui]

# With API and UI support
pip install EvolutePrompt[all]

Quick Start

from EvolutePrompt import EvolutePrompt
from EvolutePrompt import PromptCategory

# Initialize EvolutePrompt
flow = EvolutePrompt()
flow.init()

# Create a prompt
prompt_builder = flow.create_prompt()
prompt_builder.add_system("You are a helpful assistant.")
prompt_builder.add_user("What is the capital of France?")
prompt = prompt_builder.build()

# Add metadata
prompt.update_metadata(
    description="A simple geography question",
    tags=["geography", "test"],
    category=PromptCategory.QA
)

# Save the prompt
version = flow.save_prompt("capital_question", prompt)
print(f"Saved prompt version: {version}")

# Retrieve the prompt
retrieved = flow.get_prompt("capital_question")

Using the UI

EvolutePrompt includes a native Streamlit UI for prompt management and testing. To use it:

# If installed with pip
EvolutePrompt ui

# Or directly from the module
python -m EvolutePrompt.cli ui

The UI provides:

  • Prompt creation and management
  • Version control and history
  • Fallback configuration
  • Prompt testing interface
  • Database settings

Templates

# Create a template
template = flow.template_from_string(
    "What is the capital of {{country}}?", 
    variables={"country": "France"}
)

# Render the template
rendered = template.render()

# Or with different variables
rendered = template.render(country="Germany")
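The `{{variable}}` placeholders follow Jinja-style syntax. As a rough illustration of what the render step does (a minimal stand-alone sketch, not the library's actual implementation), substitution can be approximated with a regular expression:

```python
import re

def render(template: str, **variables) -> str:
    # Replace each {{name}} placeholder with the matching value from `variables`.
    return re.sub(
        r"\{\{\s*(\w+)\s*\}\}",
        lambda m: str(variables[m.group(1)]),
        template,
    )

print(render("What is the capital of {{country}}?", country="Germany"))
# → What is the capital of Germany?
```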

Strategies

# Simple active prompt selection
prompt = flow.get_active_prompt("my_prompt")

# With fallback
strategy = flow.with_fallback()
prompt = flow.select_prompt("my_prompt", strategy=strategy)

# A/B testing
ab_strategy = flow.create_ab_testing(
    prompt_variants=["prompt_a", "prompt_b"],
    weights=[0.7, 0.3]
)
prompt = flow.select_prompt("doesn't_matter", strategy=ab_strategy)

# Context-aware selection
context_strategy = flow.create_context_aware(
    context_key="language",
    prompt_mapping={
        "en": "english_prompt",
        "es": "spanish_prompt",
        "fr": "french_prompt"
    }
)
prompt = flow.select_prompt(
    "fallback_prompt", 
    strategy=context_strategy,
    context={"language": "es"}
)
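The weighted A/B selection above can be sketched in plain Python with `random.choices`. This is a hypothetical stand-in for the strategy's behavior, not EvolutePrompt's implementation:

```python
import random

def select_ab(variants, weights, rng=random):
    # Pick one prompt name, with probability proportional to its weight.
    return rng.choices(variants, weights=weights, k=1)[0]

# With weights [0.7, 0.3], "prompt_a" should win roughly 70% of the time.
counts = {"prompt_a": 0, "prompt_b": 0}
for _ in range(10_000):
    counts[select_ab(["prompt_a", "prompt_b"], [0.7, 0.3])] += 1
print(counts)
```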

FastAPI Integration

# See examples/fastapi_integration.py for a complete example

LangChain Integration

EvolutePrompt can be easily integrated with LangChain to combine robust prompt management with LangChain's orchestration capabilities:

# Convert an EvolutePrompt prompt to LangChain message objects
from langchain_core.messages import HumanMessage, SystemMessage

def EvolutePrompt_to_langchain_messages(pf_prompt):
    lc_messages = []
    for msg in pf_prompt.messages:
        if msg.role.value == "system":
            lc_messages.append(SystemMessage(content=msg.content))
        elif msg.role.value == "user":
            lc_messages.append(HumanMessage(content=msg.content))
    return lc_messages

# Use EvolutePrompt's versioning with LangChain
# (`llm` is any LangChain chat model or LLM instance)
prompt = flow.get_active_prompt("my_prompt")
lc_messages = EvolutePrompt_to_langchain_messages(prompt)
result = llm.generate([lc_messages])

For detailed examples, see the LangChain integration section of the project documentation.

Version Control

# List all prompts
prompts = flow.list_prompts()

# List all versions of a prompt
versions = flow.list_versions("my_prompt")

# Get a specific version
prompt = flow.get_prompt("my_prompt", version="0.1.0")

# Set a version as active
flow.set_active("my_prompt", "0.2.0")
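Version strings such as "0.1.0" and "0.2.0" follow the MAJOR.MINOR.PATCH convention, which must be compared numerically rather than lexically. A minimal illustrative helper (not part of the EvolutePrompt API):

```python
def latest_version(versions):
    # Compare "MAJOR.MINOR.PATCH" strings as integer tuples, so
    # "0.10.0" correctly sorts after "0.2.0".
    return max(versions, key=lambda v: tuple(int(p) for p in v.split(".")))

print(latest_version(["0.1.0", "0.10.0", "0.2.0"]))  # → 0.10.0
```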

Development

For information about setting up the development environment, running tests, and contributing to EvolutePrompt, see the project repository.

License

MIT

Download files

Source Distribution

evoluteprompt-0.1.2.tar.gz (30.8 kB)

Built Distribution

evoluteprompt-0.1.2-py3-none-any.whl (39.8 kB)

File details

Details for the file evoluteprompt-0.1.2.tar.gz.

File metadata

  • Download URL: evoluteprompt-0.1.2.tar.gz
  • Size: 30.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.8.2 CPython/3.12.2 Darwin/24.3.0

File hashes

Hashes for evoluteprompt-0.1.2.tar.gz:

  • SHA256: f0728be242ee80172d5a58c576f4dba1bd20310fd4bbd88a748adbd40c03e2a6
  • MD5: 01e5c18db816b842d13570cd0700bec3
  • BLAKE2b-256: f780ade1b273327e8270b2f97c4dd91e2e004c27e15e0f759a9540cf47c4c388


File details

Details for the file evoluteprompt-0.1.2-py3-none-any.whl.

File metadata

  • Download URL: evoluteprompt-0.1.2-py3-none-any.whl
  • Size: 39.8 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.8.2 CPython/3.12.2 Darwin/24.3.0

File hashes

Hashes for evoluteprompt-0.1.2-py3-none-any.whl:

  • SHA256: a05727862212c672e249ea15b78371e7dc1e94ecd8e5ef5404ff15d56e788977
  • MD5: 90c77a207ae5973d726d9510eaa814cf
  • BLAKE2b-256: a468b6fd18e687bae1f6d3ebdeafbad3afc5969850dc84e221df4d784f2387b1

