# GuardrailGraph
Composable AI safety pipeline framework — define guardrails as a DAG of checks that work across any LLM provider, with industry-specific compliance packs for HIPAA, SOX, GDPR, and FedRAMP.
## Why GuardrailGraph?
Every enterprise deploying LLMs needs guardrails. Current options are either provider-locked (Bedrock Guardrails), operationally complex (NeMo Guardrails), or limited in scope (Guardrails AI). GuardrailGraph is the first framework that combines:
- Composable DAG execution — checks run in parallel for low latency
- Provider agnostic — works with Bedrock, OpenAI, Anthropic, or any LLM
- Industry compliance packs — HIPAA, SOX, GDPR out of the box
- Serverless-native — designed for AWS Lambda from day one
- Simple API — `@check` decorator + `pipeline()` builder
## Installation

```bash
# Python
pip install substrai-guardrailgraph

# npm (TypeScript/JavaScript)
npm install @substrai/guardrailgraph
```
## Quick Start

### 5-Minute Setup

```python
from guardrailgraph import pipeline, check, Action
from guardrailgraph.checks import pii_check, toxicity_check, injection_check

# Create a pipeline with built-in checks
my_pipeline = pipeline(
    name="my-app",
    checks=[
        pii_check(action=Action.REDACT),
        toxicity_check(threshold=0.7),
        injection_check(),
    ],
    mode="fail-closed",
)

# Run guardrails on any text
result = my_pipeline.run("User input here")
if result.allowed:
    # Safe to forward to LLM
    text = result.modified_text or "User input here"
else:
    # Content blocked
    print(f"Blocked: {result.action.value}")
```
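The `mode="fail-closed"` option governs what happens when a check itself fails (times out, raises). A minimal sketch of that semantics in plain Python; the function and names here are illustrative, not the library's internals:

```python
def run_checks(text, checks, mode="fail-closed"):
    """Illustrative fail-closed vs. fail-open semantics: in fail-closed
    mode a crashing check blocks the request; in fail-open it is skipped."""
    for check in checks:
        try:
            if check(text):  # a check returns True when it detects a problem
                return "blocked"
        except Exception:
            if mode == "fail-closed":
                return "blocked"  # err on the side of safety
    return "allowed"

def flaky_check(text):
    """Stands in for a check whose backing classifier is unreachable."""
    raise TimeoutError("classifier unavailable")

print(run_checks("hello", [flaky_check], mode="fail-closed"))  # blocked
print(run_checks("hello", [flaky_check], mode="fail-open"))    # allowed
```

For safety-critical pipelines, fail-closed is usually the right default: a broken check should not silently widen what gets through.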
### Custom Checks

```python
from guardrailgraph import check, Action

@check(name="profanity", action=Action.BLOCK, threshold=0.7)
def check_profanity(text: str) -> dict:
    """Custom profanity detection."""
    bad_words = ["badword1", "badword2"]
    found = [w for w in bad_words if w in text.lower()]
    return {
        "detected": len(found) > 0,
        "confidence": min(len(found) / 2.0, 1.0),
        "matched": found,
    }
```
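For intuition, here is a simplified stand-in for what a threshold-aware `@check` decorator could do with the dict the detector returns. This sketches the pattern; it is not GuardrailGraph's actual implementation:

```python
import functools

def check(name, action, threshold=0.5):
    """Simplified stand-in for a @check decorator: applies the configured
    threshold to the detector's confidence score to pick an action."""
    def decorate(fn):
        @functools.wraps(fn)
        def wrapper(text):
            result = fn(text)
            triggered = result["detected"] and result["confidence"] >= threshold
            return {"name": name, "action": action if triggered else "PASS", **result}
        return wrapper
    return decorate

@check(name="profanity", action="BLOCK", threshold=0.7)
def check_profanity(text):
    bad_words = ["badword1", "badword2"]
    found = [w for w in bad_words if w in text.lower()]
    return {"detected": bool(found), "confidence": min(len(found) / 2.0, 1.0), "matched": found}

print(check_profanity("badword1 and badword2")["action"])  # BLOCK
print(check_profanity("all clean here")["action"])         # PASS
```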
### Industry Compliance Packs

```python
from guardrailgraph import pipeline
from guardrailgraph.packs import hipaa, financial

# HIPAA-compliant healthcare chatbot
healthcare = pipeline(
    name="patient-assistant",
    packs=[hipaa.full()],
)

# SOX-compliant financial advisor
finance = pipeline(
    name="investment-advisor",
    packs=[financial.sox()],
    mode="fail-closed",
)
```
### Middleware Integration

```python
from guardrailgraph.middleware import guardrail

@guardrail(pipeline=my_pipeline)
def call_llm(prompt: str) -> str:
    """Your LLM call — automatically wrapped with guardrails."""
    import boto3
    client = boto3.client("bedrock-runtime")
    # ... invoke model ...
    return response
```
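Conceptually, such middleware screens the prompt with the pipeline before the wrapped function runs, forwarding redacted text when a REDACT action fired. A sketch of that wrapping pattern, with a hypothetical `BlockedError` and a fake pipeline standing in for the real one so the example is runnable:

```python
import functools

class BlockedError(Exception):
    """Hypothetical error raised when input fails the pipeline."""

def guardrail(pipeline):
    """Sketch of the middleware pattern: screen the prompt, then forward
    the (possibly redacted) text to the wrapped LLM call."""
    def decorate(fn):
        @functools.wraps(fn)
        def wrapper(prompt):
            result = pipeline.run(prompt)
            if not result.allowed:
                raise BlockedError(result.action)
            return fn(result.modified_text or prompt)
        return wrapper
    return decorate

# Minimal fakes so the sketch runs without the library
class _Result:
    def __init__(self, allowed, action=None, modified_text=None):
        self.allowed, self.action, self.modified_text = allowed, action, modified_text

class _FakePipeline:
    def run(self, text):
        return _Result(allowed="attack" not in text, action="BLOCK")

@guardrail(pipeline=_FakePipeline())
def call_llm(prompt):
    return f"LLM({prompt})"

print(call_llm("hello"))  # LLM(hello)
```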
### YAML Configuration

```yaml
# guardrailgraph.yaml
project:
  name: "my-app-guardrails"
  version: "1.0.0"

pipeline:
  mode: fail-closed
  timeout_ms: 500
  parallel: true

checks:
  - name: pii-detection
    type: builtin/pii
    action: redact
    config:
      entity_types: [SSN, PHONE, EMAIL, CREDIT_CARD]
  - name: toxicity
    type: builtin/toxicity
    action: block
    config:
      threshold: 0.7
  - name: prompt-injection
    type: builtin/injection
    action: block
    config:
      sensitivity: high
```
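Once parsed (e.g. with `yaml.safe_load`), the config is an ordinary nested dict, so its required structure can be checked mechanically. A hedged sketch of such a validator; `validate_config` is a hypothetical helper, not part of the package:

```python
def validate_config(cfg):
    """Minimal structural validation of a parsed guardrailgraph.yaml
    (hypothetical helper; assumes the YAML was loaded into a dict)."""
    errors = []
    if cfg.get("pipeline", {}).get("mode") not in ("fail-closed", "fail-open"):
        errors.append("pipeline.mode must be fail-closed or fail-open")
    for i, chk in enumerate(cfg.get("checks", [])):
        for key in ("name", "type", "action"):
            if key not in chk:
                errors.append(f"checks[{i}] missing '{key}'")
    return errors

cfg = {
    "project": {"name": "my-app-guardrails", "version": "1.0.0"},
    "pipeline": {"mode": "fail-closed", "timeout_ms": 500, "parallel": True},
    "checks": [
        {"name": "pii-detection", "type": "builtin/pii", "action": "redact"},
        {"name": "toxicity", "type": "builtin/toxicity"},  # 'action' omitted on purpose
    ],
}
print(validate_config(cfg))  # ["checks[1] missing 'action'"]
```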
## CLI

```bash
# Scaffold a new project
guardrailgraph init my-project
guardrailgraph init my-project --pack hipaa

# Development
guardrailgraph dev                  # Interactive testing
guardrailgraph test                 # Run tests
guardrailgraph test --adversarial   # Adversarial suite
guardrailgraph validate             # Validate config
```
## Built-in Checks

| Check | Description | Default Action |
|---|---|---|
| `pii_check()` | Detects SSN, phone, email, credit card, IP | REDACT |
| `toxicity_check()` | Scores hate, violence, sexual, self-harm | BLOCK |
| `topic_check()` | Block/allow specific topics | BLOCK |
| `injection_check()` | Prompt injection defense | BLOCK |
| `cost_check()` | Token/cost limits per request | BLOCK |
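As rough intuition for the REDACT action, here is a toy regex-based redactor for a few of the listed entity types. The patterns are deliberately naive and purely illustrative; a real PII detector would use validated patterns or an NER model:

```python
import re

# Toy patterns for a few entity types; not production-grade PII detection
PATTERNS = {
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "EMAIL": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "PHONE": re.compile(r"\b\d{3}[-.]\d{3}[-.]\d{4}\b"),
}

def redact(text):
    """Replace each detected entity with a [TYPE] placeholder
    instead of blocking the whole input."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label}]", text)
    return text

print(redact("SSN 123-45-6789, mail me at a@b.com"))
# SSN [SSN], mail me at [EMAIL]
```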
## Architecture

```
Input → [Check 1] ──→ [Check 2] ──→ [Check 3]
        (parallel)    (parallel)    (parallel)
             ↓             ↓             ↓
       [PASS / BLOCK / REDACT / FLAG_FOR_REVIEW]
                           ↓
           [Final Decision + Audit Log]
```
Checks execute as a DAG (directed acyclic graph). Independent checks run in parallel for minimum latency. Dependent checks run sequentially.
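That execution model can be sketched in plain `asyncio` (illustrative scheduling code, not GuardrailGraph's scheduler): each check becomes a task that first awaits the checks it depends on, so independent checks overlap while dependent ones serialize.

```python
import asyncio

async def run_dag(checks, deps, text):
    """Run named async checks as a DAG: each check's task awaits its
    dependencies first; tasks with no unmet dependencies run concurrently."""
    tasks = {}

    def task_for(name):
        if name not in tasks:
            async def run(name=name):
                # Sequential edges: wait for this check's dependencies
                await asyncio.gather(*(task_for(d) for d in deps.get(name, [])))
                return await checks[name](text)
            tasks[name] = asyncio.ensure_future(run())
        return tasks[name]

    results = await asyncio.gather(*(task_for(n) for n in checks))
    return dict(zip(checks, results))

# Hypothetical checks: 'topic' depends on 'pii'; 'pii' and 'toxicity' are independent
async def pii(text): return "pii:ok"
async def toxicity(text): return "tox:ok"
async def topic(text): return "topic:ok"

results = asyncio.run(run_dag(
    checks={"pii": pii, "toxicity": toxicity, "topic": topic},
    deps={"topic": ["pii"]},
    text="some user input",
))
print(results)
```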
## Integration with LambdaLLM

```python
from lambdallm import handler, Model
from guardrailgraph import pipeline
from guardrailgraph.packs import hipaa

@handler(
    model=Model.CLAUDE_3_SONNET,
    guardrails=pipeline(packs=[hipaa.full()]),
)
def lambda_handler(event, context):
    return context.invoke("Answer: {q}", q=event["body"]["question"])
```
## Comparison
| Feature | Bedrock Guardrails | NeMo | Guardrails AI | GuardrailGraph |
|---|---|---|---|---|
| Provider agnostic | ❌ | ❌ | Partial | ✅ |
| Composable DAG | ❌ | ❌ | ❌ | ✅ |
| Industry packs | ❌ | ❌ | ❌ | ✅ |
| Serverless-native | Managed | ❌ | ❌ | ✅ |
| Custom checks | Limited | Complex | Yes | ✅ Simple |
| Open source | ❌ | ✅ | ✅ | ✅ MIT |
## License
MIT © Gaurav Kumar Sinha
## File details

### substrai_guardrailgraph-0.2.0.tar.gz

- Size: 60.2 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.14.4

| Algorithm | Hash digest |
|---|---|
| SHA256 | `28c79ca207f84c714a1952cacd698eae508a56da2f8782f99024bf644f947d5f` |
| MD5 | `372b4c47224e414481e016f676fc69a7` |
| BLAKE2b-256 | `8d5932e25d2ce4179f9a8b340c7007a697355b745948d1827b38c5790e20a000` |
### substrai_guardrailgraph-0.2.0-py3-none-any.whl

- Size: 63.6 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.14.4

| Algorithm | Hash digest |
|---|---|
| SHA256 | `ef5d99831ca41a2b6e15b2b4ab28aa771bad44843b2a0e817afa40e2ae478307` |
| MD5 | `adfcb19b3f1f0ca4f71b48e976952090` |
| BLAKE2b-256 | `b1d2be2b3148b614eff282240d573ab30fc0bd68aa75801ad140b0d3221499ec` |