# llmhq-promptops

A comprehensive prompt management and testing framework for production LLM workflows. Built for teams who need reliable, version-controlled prompt development with zero-manual versioning.
## Key Features

- **Automated Git Versioning** - Zero-manual versioning with git hooks and semantic version detection
- **Uncommitted Change Testing** - Test prompts instantly with `:unstaged`, `:working`, and `:latest` references
- **Python SDK Integration** - `pip install llmhq-promptops` for seamless app integration
- **Version-Aware Testing** - Test different prompt versions with comprehensive validation
- **Markdown Reports** - Automatic generation of version change documentation
- **Git Hook Automation** - Pre-commit and post-commit hooks for seamless developer workflow
## Quick Start

### Installation

```bash
pip install llmhq-promptops
```

### Initialize Your Project

```bash
# Create a new project with git hooks
promptops init repo

# Check installation
promptops --help
```

### Create Your First Prompt

```bash
# Create a new prompt template
promptops create prompt welcome-message

# Test uncommitted changes
promptops test --prompt welcome-message:unstaged

# Check status of all prompts
promptops test status
```
## Usage Examples

### Basic Prompt Resolution

```python
from llmhq_promptops import get_prompt

# Smart default (unstaged if different, else working)
prompt = get_prompt("user-onboarding")

# Specific version references
prompt = get_prompt("user-onboarding:v1.2.1")    # Specific version
prompt = get_prompt("user-onboarding:unstaged")  # Test uncommitted changes
prompt = get_prompt("user-onboarding:working")   # Latest committed (HEAD)
prompt = get_prompt("user-onboarding:latest")    # Alias for working

# With variables
rendered = get_prompt("user-onboarding", {"user_name": "Alice", "plan": "Pro"})
print(rendered)
```
### Using with LLM Frameworks

```python
from llmhq_promptops import get_prompt

# Get versioned prompt for any LLM framework
prompt_text = get_prompt(
    "user-onboarding:working",
    {"user_name": "John", "plan": "Enterprise"}
)

# Use with OpenAI
import openai

response = openai.chat.completions.create(
    model="gpt-4",
    messages=[{"role": "user", "content": prompt_text}]
)

# Use with Anthropic (max_tokens is required by the Messages API)
import anthropic

client = anthropic.Anthropic()
response = client.messages.create(
    model="claude-3-sonnet-20240229",
    max_tokens=1024,
    messages=[{"role": "user", "content": prompt_text}]
)

# Use with any other LLM framework
print(f"Prompt ready for LLM: {prompt_text}")
```
### Advanced Usage

```python
from llmhq_promptops import PromptManager

manager = PromptManager()

# Check if prompt has uncommitted changes
if manager.has_uncommitted_changes("user-onboarding"):
    # Test the latest changes
    rendered = manager.get_prompt("user-onboarding:unstaged", {"user_name": "Alice"})
else:
    # Use committed version
    rendered = manager.get_prompt("user-onboarding:working", {"user_name": "Alice"})

# Get prompt differences
diff = manager.get_prompt_diff("user-onboarding", "working", "unstaged")
print(diff)

# List all prompt statuses
statuses = manager.list_prompt_statuses()
for prompt_id, status in statuses.items():
    print(f"{prompt_id}: {status}")
```
## CLI Commands

### Initialization & Setup

```bash
# Initialize project with interactive setup
promptops init repo --interactive

# Install git hooks for automatic versioning
promptops hooks install

# Check hook status
promptops hooks status
```

### Testing & Development

```bash
# Show status of all prompts
promptops test status

# Test specific version references
promptops test --prompt user-onboarding:unstaged
promptops test --prompt user-onboarding:working
promptops test --prompt user-onboarding:v1.2.0

# Compare versions
promptops test diff user-onboarding --version1=working --version2=unstaged

# Test with custom variables
promptops test --prompt user-onboarding --variables '{"name": "Alice", "plan": "Pro"}'
```

### Hook Management

```bash
# Install automated versioning hooks
promptops hooks install

# Configure hook behavior
promptops hooks configure

# Check installation status
promptops hooks status

# Remove hooks
promptops hooks uninstall
```
## Project Structure

```
.promptops/
├── prompts/      # YAML prompt templates with metadata and auto-versioning
├── configs/      # LLM and environment configurations
├── templates/    # Jinja2 template files
├── vars/         # Variable definition files
├── tests/        # Test datasets (JSON/YAML)
├── results/      # Generated test reports (markdown)
├── logs/         # LLM call logs and analytics
├── reports/      # Auto-generated version change reports
└── config.yaml   # Git hook configuration
```
## Prompt Schema

```yaml
# .promptops/prompts/user-onboarding.yaml
# Version automatically managed by git hooks
metadata:
  id: user-onboarding
  version: "1.2.0"  # Auto-incremented by pre-commit hook
  description: "User onboarding welcome message"
  tags: ["onboarding", "welcome"]

models:
  default: gpt-4-turbo
  supported: [gpt-4-turbo, claude-3-sonnet, llama2-70b]

template: |
  Welcome {{ user_name }}!

  Available features:
  {% for feature in features %}
  - {{ feature }}
  {% endfor %}

variables:
  user_name: {type: string, required: true}
  features: {type: list, default: ["Browse", "Purchase"]}

tests:
  - dataset: .promptops/tests/onboarding-data.json
    metrics: {max_tokens: 150, min_relevance: 0.8}
```
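The `template` block is plain Jinja2, so the variable contract above can be checked outside the framework. Here is a minimal sketch of how defaults and `required` flags might be enforced before rendering — the `render` helper is hypothetical, not part of the SDK:

```python
from jinja2 import Template

# Variable definitions mirroring the schema above.
variables = {
    "user_name": {"type": "string", "required": True},
    "features": {"type": "list", "default": ["Browse", "Purchase"]},
}

template = Template(
    "Welcome {{ user_name }}!\n"
    "Available features:\n"
    "{% for feature in features %}- {{ feature }}\n{% endfor %}"
)

def render(values: dict) -> str:
    # Fill defaults, then enforce required variables before rendering.
    merged = {k: spec.get("default") for k, spec in variables.items()}
    merged.update(values)
    missing = [k for k, spec in variables.items()
               if spec.get("required") and merged.get(k) is None]
    if missing:
        raise ValueError(f"missing required variables: {missing}")
    return template.render(**merged)

print(render({"user_name": "Alice"}))
```

Omitting `user_name` would raise a `ValueError`, while `features` silently falls back to its schema default.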
## Automated Versioning

### Semantic Version Rules

- **PATCH** (1.0.0 → 1.0.1): Template content changes only
- **MINOR** (1.0.0 → 1.1.0): New variables added (backward compatible)
- **MAJOR** (1.0.0 → 2.0.0): Required variables removed (breaking change)
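The three rules above boil down to comparing the old and new variable sets. A minimal sketch of that decision logic — `classify_bump` and `bump` are hypothetical names, and the real hook's analysis may be more involved:

```python
# Classify a prompt change per the rules above: required variable
# removed -> major, new variable added -> minor, otherwise patch.
def classify_bump(old: dict, new: dict) -> str:
    old_req = {k for k, v in old["variables"].items() if v.get("required")}
    new_req = {k for k, v in new["variables"].items() if v.get("required")}
    if old_req - new_req:                              # required variable removed
        return "major"
    if set(new["variables"]) - set(old["variables"]):  # new variable added
        return "minor"
    return "patch"                                     # template content change only

def bump(version: str, kind: str) -> str:
    major, minor, patch = map(int, version.split("."))
    if kind == "major":
        return f"{major + 1}.0.0"
    if kind == "minor":
        return f"{major}.{minor + 1}.0"
    return f"{major}.{minor}.{patch + 1}"

old = {"variables": {"user_name": {"required": True}}}
new = {"variables": {"user_name": {"required": True}, "plan": {}}}
print(bump("1.0.0", classify_bump(old, new)))  # adding a variable -> 1.1.0
```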
### Git Hook Workflow

1. **Developer edits prompt** → Changes saved to working directory
2. **Test uncommitted changes** → `promptops test --prompt name:unstaged`
3. **Git add & commit** → Pre-commit hook automatically:
   - Detects changed prompts
   - Analyzes changes for semantic versioning
   - Updates version numbers in YAML files
   - Re-stages updated files
4. **Commit completes** → Post-commit hook automatically:
   - Creates git tags for new versions
   - Runs validation tests
   - Generates audit logs

**Result:** Zero manual version management with instant testing capabilities.
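The "detects changed prompts" step of the pre-commit hook amounts to filtering staged paths for files under the prompts directory. A framework-free sketch, assuming the project layout shown below and a hypothetical `changed_prompts` helper:

```python
from pathlib import PurePosixPath
from typing import List

def changed_prompts(staged_paths: List[str]) -> List[str]:
    """Return prompt ids for staged files under .promptops/prompts/."""
    ids = []
    for p in staged_paths:
        path = PurePosixPath(p)
        if path.parts[:2] == (".promptops", "prompts") and path.suffix in {".yaml", ".yml"}:
            ids.append(path.stem)
    return ids

# In a real hook the staged list would come from `git diff --cached --name-only`.
print(changed_prompts([
    ".promptops/prompts/user-onboarding.yaml",
    "src/app.py",
]))  # ['user-onboarding']
```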
## Version References

| Reference | Description | Use Case |
|---|---|---|
| `prompt-name` | Smart default (unstaged if different, else working) | Development |
| `:unstaged` | Uncommitted changes in working directory | Testing changes |
| `:working` | Latest committed version (HEAD) | Production |
| `:latest` | Alias for `:working` | Production |
| `:v1.2.3` | Specific semantic version | Reproducible builds |
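All of these references share one `name[:ref]` shape, so splitting them is a single string operation. A sketch of how such a string might be parsed — the function name and the `"smart-default"` sentinel are illustrative, not the SDK's actual internals:

```python
# Split a "prompt[:ref]" string into (name, ref); a bare name falls
# back to the smart default described in the table above.
def parse_reference(ref: str):
    name, sep, version = ref.partition(":")
    return name, (version if sep else "smart-default")

print(parse_reference("user-onboarding:v1.2.1"))  # ('user-onboarding', 'v1.2.1')
print(parse_reference("user-onboarding"))         # ('user-onboarding', 'smart-default')
```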
## Requirements

- **Python:** 3.8+
- **Git:** Required for versioning
- **YAML:** For prompt template storage
- **Jinja2:** For template rendering

## Dependencies

- **Core:** Typer (CLI), Jinja2 (templating), PyYAML (parsing)
- **Git Integration:** GitPython (versioning)
- **Typing:** typing_extensions (Python 3.8 compatibility)
## Contributing

We welcome contributions! Please see CONTRIBUTING.md for guidelines.

### Development Setup

```bash
# Clone the repository
git clone https://github.com/your-org/llmhq-promptops.git
cd llmhq-promptops

# Install development dependencies
python -m venv venv
source venv/bin/activate  # or `venv\Scripts\activate` on Windows
pip install -r requirements.txt
pip install -e .

# Run tests
python -m pytest tests/

# Test CLI commands
promptops --help
```
## License

This project is licensed under the MIT License - see the LICENSE file for details.

## Acknowledgments

- Built with Typer for CLI functionality
- Inspired by modern DevOps practices for infrastructure as code
- Designed for reliable prompt management in production applications

## Support

- **Issues:** GitHub Issues
- **Discussions:** GitHub Discussions
- **Documentation:** Full Documentation

Made with ❤️ for the LLM development community