
LLM Prompt Semantic Diff

A CLI tool for managing and comparing LLM prompts using semantic diffing instead of traditional text-based comparison.


Overview

LLM Prompt Semantic Diff delivers a lightweight command-line workflow for managing, packaging, and semantically diffing Large Language Model (LLM) prompts. It addresses the blind spot where an ordinary text-based git diff fails to reveal meaning-level changes that materially affect model behaviour.

Read More: https://medium.com/@aatakansalar/catching-prompt-regressions-before-they-ship-semantic-diffing-for-llm-workflows-feb3014ccac3

Key Features

  • F-1: prompt init - Generates skeletal prompt files and default manifests
  • F-2: prompt pack - Embeds prompts into .pp.json with semantic versioning
  • F-3: prompt diff - Semantic comparison with percentage scores and exit codes
  • F-4: Dual embedding providers (OpenAI cloud + SentenceTransformers local)
  • F-5: JSON output for CI/CD integration
  • F-6: Schema validation for all manifests
  • F-7: Comprehensive test suite with >75% coverage
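The similarity percentages reported by prompt diff (F-3) are typically computed as cosine similarity between the two prompts' embedding vectors. The sketch below illustrates that computation in plain Python; it is an assumption about the approach, not code taken from the project:

```python
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two embedding vectors (range -1.0 to 1.0)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Two toy 3-dimensional "embeddings" that point in nearly the same direction:
score = cosine_similarity([0.1, -0.2, 0.3], [0.12, -0.18, 0.31])
print(f"Semantic similarity: {score * 100:.1f}%")  # → Semantic similarity: 99.7%
```

Real embeddings have hundreds of dimensions (384 for all-MiniLM-L6-v2), but the arithmetic is the same.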

Installation

Install from source:

git clone https://github.com/aatakansalar/llm-prompt-semantic-diff
cd llm-prompt-semantic-diff
python -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate
pip install -e .

Quick Start

1. Initialize a New Prompt

prompt init my-greeting

Creates my-greeting.prompt and my-greeting.pp.json with a default structure.

2. Package an Existing Prompt

prompt pack my-prompt.prompt

Generates embeddings and creates a versioned manifest.

3. Compare Prompt Versions

# Human-readable output
prompt diff v1.pp.json v2.pp.json --threshold 0.8

# JSON output for CI/CD
prompt diff v1.pp.json v2.pp.json --json --threshold 0.8

Returns exit code 1 if the similarity falls below the threshold.

4. Validate Manifest Schema

prompt validate my-prompt.pp.json

Embedding Providers

Local (Default)

Uses SentenceTransformers with the all-MiniLM-L6-v2 model:

prompt pack my-prompt.prompt --provider sentence-transformers

Cloud (OpenAI)

Requires the OPENAI_API_KEY environment variable:

export OPENAI_API_KEY="your-api-key"
prompt pack my-prompt.prompt --provider openai
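The dual-provider design (F-4) can be sketched as a small strategy pattern. The class names, registry, and embedding model choices below are illustrative assumptions, not the project's actual internals:

```python
from abc import ABC, abstractmethod

class EmbeddingProvider(ABC):
    """Common interface both providers implement (hypothetical)."""
    @abstractmethod
    def embed(self, text: str) -> list[float]: ...

class SentenceTransformersProvider(EmbeddingProvider):
    def embed(self, text: str) -> list[float]:
        # Deferred import: only needed when this provider is actually used.
        from sentence_transformers import SentenceTransformer
        model = SentenceTransformer("all-MiniLM-L6-v2")
        return model.encode(text).tolist()

class OpenAIProvider(EmbeddingProvider):
    def embed(self, text: str) -> list[float]:
        from openai import OpenAI
        client = OpenAI()  # reads OPENAI_API_KEY from the environment
        resp = client.embeddings.create(
            model="text-embedding-3-small",  # assumed model, not confirmed
            input=text,
        )
        return resp.data[0].embedding

# Maps the --provider flag values to implementations.
PROVIDERS = {
    "sentence-transformers": SentenceTransformersProvider,
    "openai": OpenAIProvider,
}

def get_provider(name: str) -> EmbeddingProvider:
    return PROVIDERS[name]()
```

Keeping the heavyweight imports inside each embed method means selecting the local provider never touches the network, and vice versa.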

CI/CD Integration

Use the --json flag for machine-readable output:

- name: Check prompt changes
  run: |
    if ! prompt diff main.pp.json feature.pp.json --json --threshold 0.8; then
      echo "Prompt changes exceed threshold - review required"
      exit 1
    fi

Manifest Format

Prompts are packaged into .pp.json files:

{
  "content": "Your prompt text here...",
  "version": "0.1.0",
  "embeddings": [0.1, -0.2, 0.3, ...],
  "description": "Optional description",
  "tags": ["category", "type"],
  "model": "gpt-4"
}
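A minimal loader for a .pp.json manifest might look like the sketch below. The required-key set is an assumption inferred from the example fields above, not the project's actual schema (F-6 validates against a real schema):

```python
import json

# Assumed minimum fields; "description", "tags", and "model" appear optional.
REQUIRED_KEYS = {"content", "version", "embeddings"}

def load_manifest(path: str) -> dict:
    """Load a .pp.json manifest and check that the core fields are present."""
    with open(path, encoding="utf-8") as f:
        manifest = json.load(f)
    missing = REQUIRED_KEYS - manifest.keys()
    if missing:
        raise ValueError(f"Manifest missing required keys: {sorted(missing)}")
    return manifest
```

A schema validator (as in prompt validate) would additionally check field types, e.g. that embeddings is a list of numbers and version is a valid semantic version string.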

Example Workflow

# Create new prompt
prompt init greeting

# Edit greeting.prompt file
# ... make changes ...

# Package with embeddings
prompt pack greeting.prompt

# Create modified version
cp greeting.prompt greeting-v2.prompt
# ... make more changes ...
prompt pack greeting-v2.prompt

# Compare versions
prompt diff greeting.pp.json greeting-v2.pp.json

# Output:
# Semantic similarity: 85.2%
# Threshold: 80.0%
# Above threshold: Yes
# Version A: 0.1.0
# Version B: 0.1.0

Security & Privacy

  • Local-first: No data leaves your machine unless OpenAI provider is explicitly selected
  • API keys: Only read from environment variables (OPENAI_API_KEY)
  • No telemetry: No analytics, tracking, or hidden network calls

Development

git clone https://github.com/aatakansalar/llm-prompt-semantic-diff
cd llm-prompt-semantic-diff
python -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate
pip install -e ".[dev]"
pytest tests/ -v

License

Licensed under the MIT License. See LICENSE for details.
