PromptVC
A Git-like version controller for LLM prompts

PromptVC is a lightweight, local tool for Git-like version control and A/B testing of your LLM prompts. It is a dev tool that helps you find the best prompt before you add it to your workflow. It's cleaner than tracking prompt variants in a spreadsheet. Trust us.
Why?
Instead of guessing which prompt is better, test them:
- Version A: "Summarize this article: {input}"
- Version B: "Write a 2-sentence summary focusing on key insights: {input}"
See which one actually performs better with your data.
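Conceptually, an A/B test substitutes the same sample into each prompt template's {input} placeholder and sends both rendered prompts to the LLM. A minimal sketch of that substitution step (the sample text here is made up for illustration):

```python
# The two prompt versions from the example above.
version_a = "Summarize this article: {input}"
version_b = "Write a 2-sentence summary focusing on key insights: {input}"

# One test sample; the same sample is used for both versions so the
# comparison isolates the effect of the prompt wording.
sample = {"input": "LLM prompt quality varies widely across phrasings."}

prompt_a = version_a.format(**sample)
prompt_b = version_b.format(**sample)

print(prompt_a)
print(prompt_b)
```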
Install
pip install promptvc
Quick Start
# step 1. initialize promptvc
promptvc init
# step 2. create two prompt versions
promptvc add summarizer "Summarize this article: {input}"
promptvc commit summarizer "version A - simple"
promptvc add summarizer "Write a 2-sentence summary focusing on key insights: {input}"
promptvc commit summarizer "version B - structured"
# step 3. create and edit your test data in the _samples.json
promptvc init-samples summarizer
# inside summarizer_samples.json, edit your test data. Oh, and don't forget to set your API keys
# step 4. test
promptvc test summarizer 1 2 --llm openai
Usage
CLI Commands
init: Initialize new repo
add <name> <text>: Add prompt version
commit <name> <msg>: Commit staged version
history <name>: View version history
checkout <name> <version>: Get version text
diff <name> <v1> <v2>: Compare versions
list: List all prompts
init-samples <name>: Create samples file
test <name> <v1> <v2>: Compare versions with LLM
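For intuition, the commands above can be sketched as operations on a simple in-memory structure. This is an illustration only, not promptvc's actual internals:

```python
# Each prompt name maps to an ordered list of committed versions,
# mirroring the add/commit/history/checkout commands.
repo = {}

def add_and_commit(name, text, msg):
    """Append a new numbered version of a prompt (add + commit in one step)."""
    versions = repo.setdefault(name, [])
    versions.append({"version": len(versions) + 1, "text": text, "message": msg})

add_and_commit("summarizer", "Summarize this article: {input}", "version A - simple")
add_and_commit("summarizer",
               "Write a 2-sentence summary focusing on key insights: {input}",
               "version B - structured")

# history <name>: list version numbers and commit messages
for v in repo["summarizer"]:
    print(v["version"], v["message"])

# checkout <name> <version>: retrieve the text of a specific version
print(repo["summarizer"][0]["text"])
```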
Python API
from promptvc.repo import PromptRepo
import openai

def my_llm(prompt):
    response = openai.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": prompt}]
    )
    return response.choices[0].message.content

# A/B test your prompts
repo = PromptRepo()
samples = [{"input": "Your test data here"}]
results = repo.eval_versions("prompt-name", [1, 2], samples, my_llm)

# compare your outputs
for version_id, result in results.items():
    print(f"Version {version_id}:")
    for output in result['outputs']:
        print(f"  {output['output']}")
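To eyeball how two versions' outputs differ, a line diff works well. The results dict below is hard-coded in the same shape as above (with made-up outputs) so the sketch runs without an API key:

```python
import difflib

# Hypothetical results, standing in for repo.eval_versions(...) output.
results = {
    1: {"outputs": [{"output": "The article covers quarterly sales."}]},
    2: {"outputs": [{"output": "Quarterly sales rose; the team credits pricing."}]},
}

# Pair up outputs sample-by-sample and print a unified diff of each pair.
for a, b in zip(results[1]["outputs"], results[2]["outputs"]):
    diff = difflib.unified_diff(
        a["output"].splitlines(), b["output"].splitlines(),
        fromfile="version 1", tofile="version 2", lineterm="",
    )
    print("\n".join(diff))
```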
Full example
# test email writing prompts
promptvc add email-writer "Write a professional email about: {input}"
promptvc commit email-writer "formal version"
promptvc add email-writer "Write a friendly, concise email about: {input}"
promptvc commit email-writer "casual version"
# create your test cases
promptvc init-samples email-writer
# inside email-writer_samples.json:
# [
# {"input": "quarterly sales meeting"},
# {"input": "server maintenance window"},
# {"input": "new product launch"}
# ]
# test
promptvc test email-writer 1 2 --llm openai
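The samples file shown above is plain JSON: a list of objects whose keys match the placeholders in your prompt templates. A sketch of writing and reading it programmatically (the temp-file path is just for this example):

```python
import json
import os
import tempfile

# The same test cases as in the email-writer example above.
samples = [
    {"input": "quarterly sales meeting"},
    {"input": "server maintenance window"},
    {"input": "new product launch"},
]

# Write the samples file, then read it back to confirm the round trip.
path = os.path.join(tempfile.gettempdir(), "email-writer_samples.json")
with open(path, "w") as f:
    json.dump(samples, f, indent=2)

with open(path) as f:
    loaded = json.load(f)

print(len(loaded))  # 3
```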
Config
llm_providers:
  openai:
    api_key: "your-key-here"
    default_model: "gpt-4o-mini"
  anthropic:
    api_key: "your-key-here"
    default_model: "claude-3-sonnet-20240229"
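Rather than committing real keys into the config file, a common pattern is to fall back to environment variables. This is a generic sketch of that pattern, not a documented promptvc feature:

```python
import os

# Build the same config shape as above, but read the key from the
# environment so "your-key-here" never has to hold a real secret.
config = {
    "llm_providers": {
        "openai": {
            "api_key": os.environ.get("OPENAI_API_KEY", "your-key-here"),
            "default_model": "gpt-4o-mini",
        },
    },
}

key = config["llm_providers"]["openai"]["api_key"]
print("key configured" if key != "your-key-here" else "set OPENAI_API_KEY")
```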
MISC
Tests
For tests conducted, refer to TEST.md
Tutorial
Refer to TUTORIAL.md
Contribution
Refer to CONTRIBUTING.md
FAQ
- Can the library do live interactive testing?
A: No. The library is not built for live, interactive testing; the Pipeline class can only test a pre-scripted sequence of turns, not a dynamic one. The purpose of this tool is to bring version control and A/B testing to foundational, single-turn prompts.
- Does it work with any LLM?
A: It works with OpenAI and Anthropic for now.
Download files
Download the file for your platform.
Source Distribution
Built Distribution
File details
Details for the file promptvc-0.0.1.tar.gz.
File metadata
- Download URL: promptvc-0.0.1.tar.gz
- Upload date:
- Size: 16.8 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.9.6
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | efdf11af1c89de91fc516b229b4639855c4fb69e91f52f550907d2dd6bde9378 |
| MD5 | 80e7045068fd6c16c25b130473c96db0 |
| BLAKE2b-256 | 8c3519e83e12510acb57d2911c0b3c8d4cc63ebfee407011d3c541e0e38629ab |
File details
Details for the file promptvc-0.0.1-py3-none-any.whl.
File metadata
- Download URL: promptvc-0.0.1-py3-none-any.whl
- Upload date:
- Size: 9.6 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.9.6
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 9a457ac590b259483818573014d02a2f996d406081bf74c1227813f58824dca9 |
| MD5 | ac5b9d0119ea844ff22455bc4a8d5578 |
| BLAKE2b-256 | 8a0708804abb54f7a3c4760d4d5a834883ff0c2526debe454a0642af4d7e4add |