
preLLM — Small LLM decomposition middleware for prompt preprocessing, bias detection, DevOps process chains, and multi-provider orchestration via YAML config.


🛡️ preLLM

Lightweight LLM prompt middleware — bias detection, standardization, and DevOps process chains via YAML config.

preLLM sits between your application and LLM providers, automatically detecting bias, ambiguity, and dangerous patterns in prompts. It enriches queries with context, validates outputs, and supports multi-step DevOps workflows with approval gates.

Features

  • Bias & Ambiguity Detection — regex + NLTK patterns for PL/EN, with DevOps-specific guardrails
  • YAML-Driven Config — declarative rules, clarification templates, model fallbacks
  • 100+ LLM Models — via LiteLLM proxy (OpenAI, Anthropic, Llama, Mistral, etc.)
  • DevOps Process Chains — multi-step workflows with approval gates, rollback, and audit trails
  • Context Injection — auto-enrich prompts with env vars, git info, system state
  • Type-Safe Outputs — Pydantic v2 validated responses
  • Lightweight — <50MB, 5 dependencies, async-first
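The context-injection feature can be pictured as a small dict-building step. The sketch below is illustrative only (the `gather_context` and `enrich` names are not preLLM's API); it shows the idea of collecting environment variables and prepending them to a prompt:

```python
import os

def gather_context(env_keys):
    """Collect the requested environment variables, skipping unset ones."""
    return {key: os.environ[key] for key in env_keys if key in os.environ}

def enrich(prompt, context):
    """Prepend gathered context to the prompt as key=value lines."""
    header = "\n".join(f"{k}={v}" for k, v in sorted(context.items()))
    return f"[CONTEXT]\n{header}\n\n{prompt}" if header else prompt

os.environ["CLUSTER"] = "staging-eu1"  # demo value
ctx = gather_context(["CLUSTER", "NAMESPACE", "GIT_SHA"])
print(enrich("Deploy to production", ctx))
```

An unset variable is simply skipped, so the same config works across environments that define different subsets of the keys.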

Quick Start

# Install
pip install prellm

# Generate config
prellm init --devops -o rules.yaml

# Analyze a query (no LLM call)
prellm analyze "Deploy to production" --config rules.yaml

# Run with LLM (the query is Polish for "Deploy to staging")
prellm run "Zdeployuj na staging" --config rules.yaml --model gpt-4o-mini

# Execute a process chain
prellm process deploy.yaml --guard-config rules.yaml --env production

Python API

import asyncio

from prellm import preLLM, ProcessChain

async def main():
    # Simple query
    guard = preLLM("rules.yaml")
    result = await guard("Deploy to production", model="gpt-4o-mini")
    print(result.clarified)  # True — detected missing context
    print(result.content)    # Enriched response

    # Process chain
    chain = ProcessChain("deploy.yaml")
    result = await chain.execute(env="production", dry_run=True)
    for step in result.steps:
        print(f"{step.step_name}: {step.status}")

asyncio.run(main())

Configuration

rules.yaml

bias_patterns:
  - regex: "(deploy|zdeployuj)\\s+(na|to)\\s+(prod|production)"
    action: clarify
    severity: critical
    description: "Production deployment requires context"

clarify_template: "[KONTEKST]: Podaj szczegóły dla: {query}"  # Polish: "[CONTEXT]: Provide details for: {query}"
max_retries: 3
policy: devops

models:
  fallback: ["gpt-4o-mini", "llama3"]

context_sources:
  - env: [CLUSTER, NAMESPACE, GIT_SHA]
  - git: [branch, short_sha]
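The regex in `bias_patterns` can be checked on its own with Python's `re` module. This is a standalone illustration, not preLLM code, and it assumes case-insensitive matching (the config above does not say how case is handled):

```python
import re

# The pattern from rules.yaml above; YAML's "\\s" becomes "\s" after parsing.
# re.IGNORECASE is an assumption so that "Deploy" matches "deploy".
pattern = re.compile(r"(deploy|zdeployuj)\s+(na|to)\s+(prod|production)",
                     re.IGNORECASE)

for query in ["Deploy to production", "Zdeployuj na prod", "List running pods"]:
    action = "clarify" if pattern.search(query) else "pass"
    print(f"{query!r} -> {action}")
```

Both the English and the Polish phrasing trigger the `clarify` action, while an unrelated query passes through.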

deploy.yaml (Process Chain)

process: deploy-production
steps:
  - name: pre-check
    prompt: "Check readiness of {CLUSTER}"
    approval: auto
  - name: deploy
    prompt: "Rolling deploy to {CLUSTER}/{NAMESPACE}"
    approval: manual
    rollback: true
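Parsed, the chain is just a list of step mappings, which makes the approval gating easy to picture. The sketch below mirrors the YAML above as a plain dict (the `needs_human` helper is illustrative, not preLLM's API):

```python
# deploy.yaml from above, mirrored as a plain dict for illustration
chain = {
    "process": "deploy-production",
    "steps": [
        {"name": "pre-check", "prompt": "Check readiness of {CLUSTER}",
         "approval": "auto"},
        {"name": "deploy", "prompt": "Rolling deploy to {CLUSTER}/{NAMESPACE}",
         "approval": "manual", "rollback": True},
    ],
}

def needs_human(step):
    """A step with approval: manual blocks until a human signs off."""
    return step.get("approval") == "manual"

gated = [s["name"] for s in chain["steps"] if needs_human(s)]
print(gated)  # only the deploy step requires manual approval
```

The `pre-check` step runs unattended, while the rollback-capable `deploy` step waits at its manual gate.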

Architecture

User Query → BiasDetector → ContextEngine → Enrichment → LiteLLM → Pydantic Validation → Response
                                                              ↑
                                        ProcessChain → Approval Gates → Audit Trail
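The diagram reads as a left-to-right composition of stages. A minimal sketch of that shape, with illustrative stand-ins (none of these names are preLLM internals):

```python
def run_pipeline(query, stages):
    """Thread the query through each stage in order, as in the diagram above."""
    for stage in stages:
        query = stage(query)
    return query

# Illustrative stand-ins for BiasDetector and the Enrichment step
detect = lambda q: q if "production" not in q else f"[NEEDS-CONTEXT] {q}"
add_context = lambda q: f"{q} (cluster=staging)"

print(run_pipeline("Deploy to production", [detect, add_context]))
# [NEEDS-CONTEXT] Deploy to production (cluster=staging)
```

Each real stage (detection, context, enrichment, the LiteLLM call, Pydantic validation) slots into the same query-in, query-out chain.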

Development

git clone https://github.com/softreck/prellm
cd prellm
poetry install
poetry run pytest

License

Apache License 2.0 - see LICENSE for details.

Author

Created by Tom Sapletta - tom@sapletta.com
