# sparkrules

Drools-equivalent business rule engine reference implementation.

A Drools-style business rule engine for Python. Define rules in DRL, evaluate facts, and get explainable results, with no JVM required.
```python
from sparkrules.executor import RuleExecutor

result = RuleExecutor().run(
    {"amount": 1500, "region": "US"},
    'rule "high-value" when $f : Fact( amount > 1000 ) then result.risk = "high"; end',
)
print(result.fired)          # True
print(result.action_output)  # {'risk': 'high'}
```
## Install

```shell
pip install sparkrules
```
## Why SparkRules?

- **No JVM, no Drools server**: pure Python rule engine with DRL syntax, decision tables, and explainable outputs
- **Production-ready API**: FastAPI service with a browser-based Rules Workbench (Monaco editor, LSP diagnostics, simulations)
- **Optional Spark integration**: scale to cluster DataFrames with `apply_drl()` when you need it, stay on pure Python when you don't
- **Governance built in**: versioned rules, namespace scoping, dev → stage → prod promotion, deprecation workflows
- **100% test coverage**: 521 tests, property-based testing with Hypothesis, enforced coverage gate
## Performance
The default API path runs pure Python (no Spark). Typical throughput on a single core:
| Scenario | Throughput |
|---|---|
| Simple single-pattern rule | ~5,000-10,000 evals/sec |
| Multi-condition rules with actions | ~1,000-5,000 evals/sec |
| Full API round-trip (HTTP + parse + eval) | ~200-500 req/sec |
For higher throughput, use `apply_drl()` with PySpark to distribute evaluation across a cluster. See `docs/BENCHMARKS.md` for methodology and `docs/KNOWN_LIMITATIONS.md` for honest scope boundaries.
## Quick start (from source)

```shell
git clone https://github.com/vaquarkhan/sparkrules.git
cd sparkrules
python -m pip install -e ".[test]"
python -c "import sparkrules; print('ok', sparkrules.__version__)"
pytest tests/ -q
```
## Start the API + Workbench

```shell
pip install sparkrules
python -m uvicorn sparkrules.api.app:create_app --factory --host 127.0.0.1 --port 8042
```

Then open:

- OpenAPI docs: http://127.0.0.1:8042/docs
- Rules Workbench: http://127.0.0.1:8042/workbench/

Or use Docker:

```shell
docker compose up --build
# → http://127.0.0.1:8042/workbench/
```
## Feature highlights

### Rule authoring

- DRL-style `when`/`then` rule language with salience, agenda groups, activation groups
- Decision tables with hit policies: `UNIQUE`, `FIRST`, `PRIORITY`, `COLLECT`
- XLSX decision-table import/export
- Rule templates with placeholder substitution
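For illustration, here is a rule using salience and an agenda group, written in the same style as the quick-start example above. The attribute syntax follows Drools conventions; the exact grammar SparkRules accepts may differ, so treat this as a sketch and check `docs/FEATURES.md`:

```
rule "vip-discount"
    salience 10                // higher salience is evaluated first
    agenda-group "pricing"     // only fires while the "pricing" group is active
when
    $f : Fact( amount > 1000, region == "US" )
then
    result.discount = "vip";
end
```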
### Execution
- Explainable outputs: bound fields, action outputs, reason codes
- Batch, two-pass, and streaming evaluation modes
- Deterministic replay with versioned rule snapshots
- Multi-pattern rules with local Cartesian expansion
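"Local Cartesian expansion" means a rule with several patterns is matched against every combination of candidate facts. A minimal plain-Python sketch of the idea (illustrative only, not sparkrules internals):

```python
from itertools import product

# Candidate facts for each pattern of a two-pattern rule.
orders = [{"id": 1, "amount": 1500}, {"id": 2, "amount": 200}]
customers = [{"name": "ada", "tier": "gold"}]

def matches(order, customer):
    # Joint condition across both patterns: high-value order AND gold customer.
    return order["amount"] > 1000 and customer["tier"] == "gold"

# Cartesian expansion: test every (order, customer) combination locally.
activations = [(o, c) for o, c in product(orders, customers) if matches(o, c)]
print(len(activations))  # 1
```

Each surviving combination becomes one activation, which is why multi-pattern rules cost more than single-pattern ones in the benchmark table above.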
### API and Workbench
- FastAPI with OpenAPI docs, health endpoint, rule CRUD, simulations
- Rules Workbench — browser UI with Monaco DRL editor, validate + LSP diagnostics, simulation, deployment readout
- Shadow, coverage, counterfactual, and chain simulation modes
- Time-travel debug capture and replay
- Data quality checks (not-null, range, in-set)
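The three data-quality checks map naturally onto simple predicates over fact fields. A minimal sketch of the concept (the helper and rule schema here are hypothetical, not the sparkrules API):

```python
def check_fact(fact, rules):
    """Run not-null / range / in-set checks; return a list of violations."""
    violations = []
    for field, rule in rules.items():
        value = fact.get(field)
        if rule.get("not_null") and value is None:
            violations.append(f"{field}: must not be null")
        elif value is not None:
            lo, hi = rule.get("range", (None, None))
            if lo is not None and not (lo <= value <= hi):
                violations.append(f"{field}: out of range [{lo}, {hi}]")
            allowed = rule.get("in_set")
            if allowed and value not in allowed:
                violations.append(f"{field}: not in {allowed}")
    return violations

print(check_fact(
    {"amount": 99999, "region": "EU"},
    {"amount": {"not_null": True, "range": (0, 10000)},
     "region": {"in_set": {"US", "EU"}}},
))  # ['amount: out of range [0, 10000]']
```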
### Governance
- Versioned metadata store (in-memory, DuckDB, Iceberg, Postgres backends)
- Rule namespaces for multi-project scoping
- Dev → stage → prod promotion pins
- Deprecation workflow: propose, approve, enforce
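Promotion pins can be pictured as an environment-to-version mapping, where promoting an environment copies its pinned version forward. An illustrative sketch of that mental model (not the sparkrules store API):

```python
# Each environment pins one rule-set version; prod never moves implicitly.
pins = {"dev": "pricing@v7", "stage": "pricing@v5", "prod": "pricing@v4"}

def promote(pins, src, dst):
    pins = dict(pins)      # copy: never mutate the shared pin table in place
    pins[dst] = pins[src]  # dst now serves exactly what src was serving
    return pins

pins = promote(pins, "stage", "prod")
print(pins["prod"])  # pricing@v5
```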
### Spark integration (optional)

- `apply_drl(df, drl)` for cluster DataFrame evaluation via `mapPartitions`
- Pure Python by default: Spark is opt-in, not required
- Platform config: local, AWS Glue, Databricks, GCP Dataproc, Azure Synapse

Spark vs Python: the API and Workbench run pure Python. For cluster-scale evaluation, see `docs/SPARK_INTEGRATION.md`.
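The `mapPartitions` approach matters because DRL parsing is much more expensive than per-row evaluation: each partition's rows stream through a single parsed rule set, so the parse happens once per partition rather than once per row. A plain-Python sketch of that per-partition shape (`parse_rules` here is a toy stand-in, not the sparkrules parser):

```python
def parse_rules(drl):
    # Toy stand-in for DRL parsing; the real engine compiles the rule text.
    threshold = 1000
    class Engine:
        def evaluate(self, row):
            return {"risk": "high" if row["amount"] > threshold else "low"}
    return Engine()

def eval_partition(rows, drl):
    # The shape of a function handed to Spark's mapPartitions: parse the DRL
    # once for the whole partition, then stream rows through the rule set.
    engine = parse_rules(drl)
    for row in rows:
        yield engine.evaluate(row)

rows = iter([{"amount": 1500}, {"amount": 200}])
print(list(eval_partition(rows, "...drl...")))
# [{'risk': 'high'}, {'risk': 'low'}]
```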
## How it works

1. Author rules in DRL or decision-table form
2. Parse and validate rule syntax
3. Evaluate facts and produce explainable results
4. Version, replay, and govern the rule lifecycle

Architecture details: `docs/HOW_IT_WORKS.md`
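In miniature, the parse-then-evaluate-then-explain pipeline looks like the following. This is a toy sketch to convey the flow, not the sparkrules implementation; the one-operator condition grammar is an assumption for brevity:

```python
import operator

OPS = {">": operator.gt, "<": operator.lt, "==": operator.eq}

def parse(cond):
    # "amount > 1000" -> (field, comparison function, numeric literal)
    field, op, lit = cond.split()
    return field, OPS[op], float(lit)

def evaluate(fact, cond, action):
    field, op, lit = parse(cond)
    fired = op(fact[field], lit)
    return {
        "fired": fired,
        "action_output": action if fired else {},
        # The explainable part: a reason the caller can log or display.
        "reason": f"{field}={fact[field]} {'satisfies' if fired else 'fails'} {cond}",
    }

print(evaluate({"amount": 1500}, "amount > 1000", {"risk": "high"}))
```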
## Use cases
- POS end-of-day decisioning
- Streaming authorization logic
- Settlement replay and correction
- Underwriting decision support
- Clinical trial eligibility screening
Details: docs/USE_CASES.md · Examples: examples/
## API key (optional)

Set `SPARKRULES_API_KEY` to require authentication on mutating endpoints and sensitive GETs. Without a key configured, `/health`, the OpenAPI docs, and the Workbench static shell remain public. See "Start the API + Workbench" above for run details.
## Documentation
| Topic | Link |
|---|---|
| Features | docs/FEATURES.md |
| Architecture | docs/HOW_IT_WORKS.md |
| Developer guide | docs/DEVELOPER_GUIDE.md |
| Use cases | docs/USE_CASES.md |
| Spark integration | docs/SPARK_INTEGRATION.md |
| Governance | docs/GOVERNANCE.md |
| Benchmarks | docs/BENCHMARKS.md |
| Known limitations | docs/KNOWN_LIMITATIONS.md |
| Roadmap | docs/ROADMAP.md |
| Publishing / CI | docs/PUBLISHING.md |
| Contributing | CONTRIBUTING.md |
| Changelog | CHANGELOG.md |
## License
Licensed under the Apache License, Version 2.0.
Copyright 2026 Vaquar Khan. See CITATION.cff for citation details.
## Project details
### File details: sparkrules-1.0.0.tar.gz

- Size: 114.7 kB
- Tags: Source
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.12

| Algorithm | Hash digest |
|---|---|
| SHA256 | `ecb10da75a6dc5e340f6331677bddc73e25e21747fce7d8e0a984658406fe9b6` |
| MD5 | `b305f1fae8ab19562dc42fe4c1d5ced5` |
| BLAKE2b-256 | `92cd585afc131df6886a743de9fc4de252372b8ad36a3b046e380360db43b6fb` |
#### Provenance

The following attestation bundles were made for sparkrules-1.0.0.tar.gz:

- Publisher: `pypi-release.yml` on vaquarkhan/sparkrules
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: sparkrules-1.0.0.tar.gz
- Subject digest: `ecb10da75a6dc5e340f6331677bddc73e25e21747fce7d8e0a984658406fe9b6`
- Sigstore transparency entry: 1417308023
- Permalink: vaquarkhan/sparkrules@75dc48ae01daa31bb70b7ea2d4fa75167d1c7047
- Branch / Tag: refs/heads/main
- Owner: https://github.com/vaquarkhan
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: pypi-release.yml@75dc48ae01daa31bb70b7ea2d4fa75167d1c7047
- Trigger Event: workflow_dispatch
### File details: sparkrules-1.0.0-py3-none-any.whl

- Size: 133.7 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.12

| Algorithm | Hash digest |
|---|---|
| SHA256 | `6a916ed0ab20b6615d409a2a9cb09298629b532daa2cd8167229581e3bb68a55` |
| MD5 | `debc120773bc75ac5a0b6b81f2f214bb` |
| BLAKE2b-256 | `38d210756633dff2e9add5b576b05086f1e2d61b88a5a361cdfddf688cbc23ee` |
#### Provenance

The following attestation bundles were made for sparkrules-1.0.0-py3-none-any.whl:

- Publisher: `pypi-release.yml` on vaquarkhan/sparkrules
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: sparkrules-1.0.0-py3-none-any.whl
- Subject digest: `6a916ed0ab20b6615d409a2a9cb09298629b532daa2cd8167229581e3bb68a55`
- Sigstore transparency entry: 1417308029
- Permalink: vaquarkhan/sparkrules@75dc48ae01daa31bb70b7ea2d4fa75167d1c7047
- Branch / Tag: refs/heads/main
- Owner: https://github.com/vaquarkhan
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: pypi-release.yml@75dc48ae01daa31bb70b7ea2d4fa75167d1c7047
- Trigger Event: workflow_dispatch