Chorus - LLM Prompt Versioning Tool with Dual Versioning System

Project description

Chorus

A Python package for LLM prompt versioning and tracking, with a dual versioning system and a web interface.

Features

  • Dual Versioning System: Project version (semantic) + Agent version (incremental)
  • Automatic Prompt Interception: Automatically intercepts and extracts prompts from LLM API calls
  • Execution Tracking: Captures inputs, outputs, and execution times
  • Web Interface: Beautiful web UI for prompt management and visualization
  • CLI Tools: Command-line interface for prompt management
  • Export/Import: JSON export/import for prompt data
  • Semantic Versioning: Full support for semantic versioning of prompts

Installation

pip install prompt-chorus

Quick Start

1. Basic Usage

from prompt_chorus import chorus

@chorus(system_name="my_ai_system", project_version="1.0.0", description="Basic Q&A prompt")
def ask_question(question: str) -> str:
    # Your LLM API calls are automatically intercepted and prompts extracted
    import openai
    response = openai.ChatCompletion.create(
        model="gpt-3.5-turbo",
        messages=[
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": f"Answer: {question}"}
        ]
    )
    return response.choices[0].message.content

# Run the function - prompts are automatically tracked
result = ask_question("What is Python?")

2. Auto-versioning

@chorus(system_name="text_processor", description="Auto-versioned prompt")
def process_text(text: str) -> str:
    # Prompts from any LLM API call are automatically captured
    import anthropic
    response = anthropic.Anthropic().messages.create(
        model="claude-3-sonnet-20240229",
        max_tokens=1000,
        messages=[
            {"role": "user", "content": f"Process this text: {text}"}
        ]
    )
    return response.content[0].text

# Each time you modify the prompt, the agent version auto-increments

3. CLI Usage

# List all tracked prompts
chorus list

# Show specific prompt details
chorus show ask_question 1.0.0

# Start web interface
chorus web

# Export prompts
chorus export --output my_prompts.json

4. Web Interface

chorus web --port 3000

Open your browser to http://localhost:3000 for a beautiful web interface to manage your prompts.

How It Works

Chorus automatically intercepts LLM API calls made within decorated functions and extracts the prompts for versioning. There is no need to specify prompts manually - just use your existing LLM libraries as usual.
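Chorus's actual interception mechanism is not documented here, but decorator-based monkeypatching of a client method is one plausible way to achieve this behavior. The sketch below is a minimal, hypothetical illustration using a stand-in `FakeClient` instead of a real LLM SDK; none of these names come from the prompt-chorus codebase.

```python
import functools

captured_prompts = []  # stand-in for Chorus's prompt store

class FakeClient:
    """Hypothetical stand-in for an LLM SDK client."""
    def create(self, messages):
        return {"content": "stub response"}

def intercept(func):
    """Temporarily patch FakeClient.create to record prompts while func runs."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        original = FakeClient.create
        def patched(self, messages):
            captured_prompts.append(messages)   # extract the prompt
            return original(self, messages)     # then forward the call
        FakeClient.create = patched
        try:
            return func(*args, **kwargs)
        finally:
            FakeClient.create = original        # always restore the original
    return wrapper

@intercept
def ask(question):
    return FakeClient().create([{"role": "user", "content": question}])

ask("What is Python?")
```

After the call, `captured_prompts` holds the message list that was passed to the client, without the decorated function having to declare its prompt anywhere.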

Supported LLM Providers

  • OpenAI: openai.ChatCompletion.create(), openai.Completion.create()
  • Anthropic: anthropic.Anthropic().messages.create()
  • Google: google.generativeai.GenerativeModel.generate_content()
  • Cohere: cohere.Client().chat(), cohere.Client().generate()
  • LangChain: All LangChain LLM calls
  • And more: Extensible architecture for additional providers

Advanced Features

Dual Versioning System

Chorus uses a dual versioning approach:

  • Project Version: Semantic version for project changes (e.g., "1.0.0")
  • Agent Version: Incremental version for prompt changes (auto-incremented)
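The bookkeeping behind auto-incrementing is not shown in this README; one simple way to model it is to hash the prompt text and bump the agent version only when the hash changes. The class below is a hypothetical sketch of that idea, not prompt-chorus's actual implementation.

```python
import hashlib

class PromptRegistry:
    """Hypothetical sketch of dual versioning: a manually set semantic
    project version plus an agent version that auto-increments whenever
    the tracked prompt text changes."""
    def __init__(self, project_version="1.0.0"):
        self.project_version = project_version
        self.agent_version = 0
        self._last_hash = None

    def track(self, prompt_text):
        digest = hashlib.sha256(prompt_text.encode()).hexdigest()
        if digest != self._last_hash:   # prompt changed: bump agent version
            self.agent_version += 1
            self._last_hash = digest
        return (self.project_version, self.agent_version)

reg = PromptRegistry("1.0.0")
print(reg.track("Answer: {question}"))             # first sighting: ('1.0.0', 1)
print(reg.track("Answer: {question}"))             # unchanged:      ('1.0.0', 1)
print(reg.track("Answer concisely: {question}"))   # edited:         ('1.0.0', 2)
```

The project version only moves when you set it explicitly in the decorator, while the agent version tracks every edit to the prompt itself.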

Prompt Tracking

  • Automatic interception of LLM API calls (OpenAI, Anthropic, etc.)
  • Real-time prompt extraction from intercepted messages
  • Execution time tracking
  • Input/output capture
  • Error handling and logging

Web Interface Features

  • Visual prompt management
  • Version comparison
  • Execution history
  • Export/import functionality

Development

Setup

git clone https://github.com/ConsensusLabsAI/prompt-chorus.git
cd prompt-chorus
pip install -e .

Testing

pip install -e ".[dev]"
pytest

License

MIT License - see LICENSE file for details.

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

Download files

Download the file for your platform.

Source Distribution

prompt_chorus-0.1.2.tar.gz (21.1 kB, source)

Built Distribution

prompt_chorus-0.1.2-py3-none-any.whl (22.4 kB, Python 3)

File details

Details for the file prompt_chorus-0.1.2.tar.gz.

File metadata

  • Filename: prompt_chorus-0.1.2.tar.gz
  • Size: 21.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.13.5

File hashes

  • SHA256: c50f60a280bac3e5edaffe6d7a3359dfbfca5b06356a64ffc90230e6373bed16
  • MD5: f3ba5d6b80ebf6ce289742510f4f6d28
  • BLAKE2b-256: 5910416f16e6cbff58939cc5fced7e004d569d60f58304c229009674476f3ce4

File details

Details for the file prompt_chorus-0.1.2-py3-none-any.whl.

File metadata

  • Filename: prompt_chorus-0.1.2-py3-none-any.whl
  • Size: 22.4 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.13.5

File hashes

  • SHA256: fdae6a2b9c4938f559a0fd1d3a5fd112ca25b46a8ba6a72e2ecff08a8763255e
  • MD5: 6b78d42bc304018bda736d19d2108062
  • BLAKE2b-256: f707518a9595ec1b29e10e0a874ad65271d5dbda584750e64315d39f7ccea3a2
