
🛡️ PromptGuard

Lightweight LLM prompt middleware — bias detection, standardization, and DevOps process chains via YAML config.

PromptGuard sits between your application and LLM providers, automatically detecting bias, ambiguity, and dangerous patterns in prompts. It enriches queries with context, validates outputs, and supports multi-step DevOps workflows with approval gates.

Features

  • Bias & Ambiguity Detection — regex + NLTK patterns for Polish and English (PL/EN), with DevOps-specific guardrails
  • YAML-Driven Config — declarative rules, clarification templates, model fallbacks
  • 100+ LLM Models — via LiteLLM proxy (OpenAI, Anthropic, Llama, Mistral, etc.)
  • DevOps Process Chains — multi-step workflows with approval gates, rollback, and audit trails
  • Context Injection — auto-enrich prompts with env vars, git info, system state
  • Type-Safe Outputs — Pydantic v2 validated responses
  • Lightweight — <50MB, 5 dependencies, async-first
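The context-injection feature can be pictured with a short sketch: pull selected environment variables and git metadata, then prepend them to the prompt. The helper names here are illustrative only, not PromptGuard's actual ContextEngine API.

```python
import os
import subprocess

def gather_context(env_keys):
    """Collect environment variables and git metadata for prompt enrichment.

    Illustrative only -- PromptGuard's real ContextEngine may differ.
    """
    context = {k: os.environ[k] for k in env_keys if k in os.environ}
    try:
        # Current branch and abbreviated commit hash, matching the
        # `git: [branch, short_sha]` context source in rules.yaml below.
        context["branch"] = subprocess.check_output(
            ["git", "rev-parse", "--abbrev-ref", "HEAD"], text=True
        ).strip()
        context["short_sha"] = subprocess.check_output(
            ["git", "rev-parse", "--short", "HEAD"], text=True
        ).strip()
    except (subprocess.CalledProcessError, FileNotFoundError):
        pass  # not inside a git repo; skip git context

    return context

def enrich(prompt, context):
    """Prepend gathered context to the prompt as a simple header block."""
    header = "\n".join(f"{k}={v}" for k, v in sorted(context.items()))
    return f"[CONTEXT]\n{header}\n\n{prompt}" if context else prompt
```

The enriched prompt gives the model the cluster, namespace, and commit it is acting on, which is what makes vague queries like "deploy this" answerable.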

Quick Start

# Install
pip install promptguard

# Generate config
promptguard init --devops -o rules.yaml

# Analyze a query (no LLM call)
promptguard analyze "Deploy to production" --config rules.yaml

# Run with LLM (Polish query: "Deploy to staging")
promptguard run "Zdeployuj na staging" --config rules.yaml --model gpt-4o-mini

# Execute a process chain
promptguard process deploy.yaml --guard-config rules.yaml --env production

Python API

from promptguard import PromptGuard, ProcessChain

# Simple query
guard = PromptGuard("rules.yaml")
result = await guard("Deploy to production", model="gpt-4o-mini")
print(result.clarified)  # True — detected missing context
print(result.content)    # Enriched response

# Process chain
chain = ProcessChain("deploy.yaml")
result = await chain.execute(env="production", dry_run=True)
for step in result.steps:
    print(f"{step.step_name}: {step.status}")

Configuration

rules.yaml

bias_patterns:
  - regex: "(deploy|zdeployuj)\\s+(na|to)\\s+(prod|production)"
    action: clarify
    severity: critical
    description: "Production deployment  requires context"

clarify_template: "[KONTEKST]: Podaj szczegóły dla: {query}"  # Polish: "[CONTEXT]: Provide details for: {query}"
max_retries: 3
policy: devops

models:
  fallback: ["gpt-4o-mini", "llama3"]

context_sources:
  - env: [CLUSTER, NAMESPACE, GIT_SHA]
  - git: [branch, short_sha]
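The bias pattern in this config is a plain regular expression, so you can check what it will flag directly. Case-insensitive matching is an assumption here; the config does not state which regex flags PromptGuard applies.

```python
import re

# The production-deployment pattern from rules.yaml above.
# IGNORECASE is an assumption -- the config does not specify flags.
pattern = re.compile(
    r"(deploy|zdeployuj)\s+(na|to)\s+(prod|production)",
    re.IGNORECASE,
)

def needs_clarification(query):
    """Return True if the query matches the critical deployment pattern."""
    return pattern.search(query) is not None
```

With this pattern, `needs_clarification("Deploy to production")` and `needs_clarification("Zdeployuj na prod")` are both True, while a staging deploy passes through untouched.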

deploy.yaml (Process Chain)

process: deploy-production
steps:
  - name: pre-check
    prompt: "Check readiness of {CLUSTER}"
    approval: auto
  - name: deploy
    prompt: "Rolling deploy to {CLUSTER}/{NAMESPACE}"
    approval: manual
    rollback: true

Architecture

User Query → BiasDetector → ContextEngine → Enrichment → LiteLLM → Pydantic Validation → Response
                                                              ↑
                                        ProcessChain → Approval Gates → Audit Trail
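Read left to right, the diagram is a linear middleware pipeline: each component takes the query, transforms or validates it, and hands it on. Conceptually (with stand-in lambdas, not the real classes):

```python
def run_pipeline(query, stages):
    """Pass a query through each stage in order, mirroring the diagram."""
    for stage in stages:
        query = stage(query)
    return query

# Stand-ins for the diagram's components; each takes and returns a string.
stages = [
    lambda q: q,                          # BiasDetector: would flag/clarify here
    lambda q: "[ctx CLUSTER=dev] " + q,   # ContextEngine + Enrichment
    lambda q: f"response to: {q}",        # LiteLLM call (mocked)
]
```

ProcessChain sits alongside this pipeline, feeding each step's prompt through it and recording the result in the audit trail.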

Development

git clone https://github.com/softreck/promptguard
cd promptguard
poetry install
poetry run pytest

License

Apache License 2.0 - see LICENSE for details.

Author

Created by Tom Sapletta - tom@sapletta.com
