
🛡️ Prellm

Lightweight LLM prompt middleware — bias detection, standardization, and DevOps process chains via YAML config.

Prellm sits between your application and LLM providers, automatically detecting bias, ambiguity, and dangerous patterns in prompts. It enriches queries with context, validates outputs, and supports multi-step DevOps workflows with approval gates.
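The detection step can be illustrated with a minimal, self-contained sketch using only the standard library's `re` module. The patterns and action labels below are illustrative, not Prellm's shipped rule set:

```python
import re

# Illustrative guard rules: each maps a risky pattern to an action.
# These examples are assumptions, not Prellm's actual internals.
RULES = [
    (re.compile(r"\b(prod|production)\b", re.IGNORECASE), "clarify"),
    (re.compile(r"\brm\s+-rf\b"), "block"),
]

def analyze(query: str) -> str:
    """Return the first matching action, or 'pass' if no rule fires."""
    for pattern, action in RULES:
        if pattern.search(query):
            return action
    return "pass"

print(analyze("Deploy to production"))  # clarify
print(analyze("rm -rf /tmp/build"))     # block
print(analyze("List open issues"))      # pass
```

The first rule that fires wins, which is why ordering matters when rules overlap.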

Features

  • Bias & Ambiguity Detection — regex + NLTK patterns for Polish and English (PL/EN), with DevOps-specific guardrails
  • YAML-Driven Config — declarative rules, clarification templates, model fallbacks
  • 100+ LLM Models — via LiteLLM proxy (OpenAI, Anthropic, Llama, Mistral, etc.)
  • DevOps Process Chains — multi-step workflows with approval gates, rollback, and audit trails
  • Context Injection — auto-enrich prompts with env vars, git info, system state
  • Type-Safe Outputs — Pydantic v2 validated responses
  • Lightweight — <50MB, 5 dependencies, async-first

Quick Start

# Install
pip install prellm

# Generate config
prellm init --devops -o rules.yaml

# Analyze a query (no LLM call)
prellm analyze "Deploy to production" --config rules.yaml

# Run with LLM
prellm run "Zdeployuj na staging" --config rules.yaml --model gpt-4o-mini  # Polish: "Deploy to staging"

# Execute a process chain
prellm process deploy.yaml --guard-config rules.yaml --env production

Python API

import asyncio
from prellm import preLLM, ProcessChain

async def main():
    # Simple query
    guard = preLLM("rules.yaml")
    result = await guard("Deploy to production", model="gpt-4o-mini")
    print(result.clarified)  # True — detected missing context
    print(result.content)    # Enriched response

    # Process chain
    chain = ProcessChain("deploy.yaml")
    result = await chain.execute(env="production", dry_run=True)
    for step in result.steps:
        print(f"{step.step_name}: {step.status}")

asyncio.run(main())  # await requires an async context outside the REPL

Configuration

rules.yaml

bias_patterns:
  - regex: "(deploy|zdeployuj)\\s+(na|to)\\s+(prod|production)"
    action: clarify
    severity: critical
    description: "Production deployment requires context"

clarify_template: "[KONTEKST]: Podaj szczegóły dla: {query}"  # Polish: "[CONTEXT]: Provide details for: {query}"
max_retries: 3
policy: devops

models:
  fallback: ["gpt-4o-mini", "llama3"]

context_sources:
  - env: [CLUSTER, NAMESPACE, GIT_SHA]
  - git: [branch, short_sha]
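To see how the first rule above fires, here is a sketch applying the same regex and clarification template by hand with the standard library, bypassing Prellm itself. Case-insensitive matching is an assumption about how Prellm compiles its patterns:

```python
import re

# The regex and template from the rules.yaml example above.
pattern = re.compile(
    r"(deploy|zdeployuj)\s+(na|to)\s+(prod|production)", re.IGNORECASE
)
# Polish: "[CONTEXT]: Provide details for: {query}"
clarify_template = "[KONTEKST]: Podaj szczegóły dla: {query}"

query = "Deploy to production"
if pattern.search(query):
    clarification = clarify_template.format(query=query)
    print(clarification)
    # [KONTEKST]: Podaj szczegóły dla: Deploy to production
```

Note the doubled backslashes in the YAML (`\\s`): inside a double-quoted YAML scalar they collapse to the single `\s` the regex engine expects.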

deploy.yaml (Process Chain)

process: deploy-production
steps:
  - name: pre-check
    prompt: "Check readiness of {CLUSTER}"
    approval: auto
  - name: deploy
    prompt: "Rolling deploy to {CLUSTER}/{NAMESPACE}"
    approval: manual
    rollback: true
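A chain of this shape can be modeled with a small executor. The sketch below is not Prellm's implementation; it assumes a dry run executes auto-approved steps and skips steps that would need manual approval:

```python
from dataclasses import dataclass

@dataclass
class Step:
    name: str
    prompt: str
    approval: str = "auto"   # "auto" or "manual"
    rollback: bool = False

@dataclass
class StepResult:
    step_name: str
    status: str

def execute(steps, dry_run=False):
    """Run steps in order; manual-approval steps are skipped in dry-run mode."""
    results = []
    for step in steps:
        if step.approval == "manual" and dry_run:
            results.append(StepResult(step.name, "skipped"))
            continue
        # A real executor would send step.prompt to the LLM here.
        results.append(StepResult(step.name, "ok"))
    return results

steps = [
    Step("pre-check", "Check readiness of {CLUSTER}"),
    Step("deploy", "Rolling deploy to {CLUSTER}/{NAMESPACE}",
         approval="manual", rollback=True),
]
for r in execute(steps, dry_run=True):
    print(f"{r.step_name}: {r.status}")
# pre-check: ok
# deploy: skipped
```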

Architecture

User Query → BiasDetector → ContextEngine → Enrichment → LiteLLM → Pydantic Validation → Response
                                                              ↑
                                        ProcessChain → Approval Gates → Audit Trail
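The ContextEngine → Enrichment stage amounts to prepending system state to the prompt. A sketch using environment variables (the variable names come from the rules.yaml example; the header format is an assumption):

```python
import os

def enrich(query: str, env_keys, env=None) -> str:
    """Prepend whichever of the given variables are set to the query."""
    env = os.environ if env is None else env
    context = {k: env[k] for k in env_keys if k in env}
    if not context:
        return query
    header = ", ".join(f"{k}={v}" for k, v in context.items())
    return f"[context: {header}] {query}"

# Explicit env dict for a deterministic demo; omit it to read os.environ.
print(enrich("Deploy to production", ["CLUSTER", "NAMESPACE", "GIT_SHA"],
             env={"CLUSTER": "staging-eu1"}))
# [context: CLUSTER=staging-eu1] Deploy to production
```

Unset variables are silently dropped, so the same config works in CI and on a laptop.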

Development

git clone https://github.com/softreck/prellm
cd prellm
poetry install
poetry run pytest

License

Apache License 2.0 - see LICENSE for details.

Author

Created by Tom Sapletta - tom@sapletta.com
