EBRM System — a full reasoning pipeline with hierarchical latents, adaptive test-time compute, symbolic verification, and self-consistency voting.
Project description
ebrm-system
Energy-Based Reasoning Machine — the system. A production reasoning pipeline: intent routing, adaptive test-time compute, energy-based scoring, external verifier bridge, and self-consistency voting.
This repository is the system layer on top of the ebrm model (research / paper reference). ebrm-system is the framework you deploy — ebrm is the model you cite.
Why this exists
Modern reasoning LLMs are strong but unverifiable: they emit plausible chains that are hard to check mechanically. ebrm-system wraps any base reasoner with a pipeline that makes answers auditable, budget-aware, and consistency-checked.
Architecture
query
│
▼
┌──────────────────────────┐
│ 1. Intent Classifier │ rule-based or neural; emits difficulty + budget
└──────────────────────────┘
│
▼
┌──────────────────────────┐
│ 2. Hierarchical Reasoner │ latent-thought inner loop (Coconut-inspired)
└──────────────────────────┘
│
▼
┌──────────────────────────┐
│ 3. Adaptive Langevin │ steps scale with difficulty; K parallel traces
└──────────────────────────┘
│
▼
┌──────────────────────────┐
│ 4. Process Reward Model │ stepwise energy → trace confidence
└──────────────────────────┘
│
▼
┌──────────────────────────┐
│ 5. External Verifier │ SymPy / sandboxed exec / regex — mechanical check
└──────────────────────────┘
│
▼
┌──────────────────────────┐
│ 6. Self-Consistency Vote │ weighted by confidence or 1/energy
└──────────────────────────┘
│
▼
answer + audit trail
Every stage is a swappable component behind a Protocol. The verifier layer never hallucinates — it only confirms what SymPy / Python / regex can mechanically check.
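To illustrate what "swappable component behind a Protocol" means in practice, here is a minimal sketch. The `Verifier` Protocol and `RegexVerifier` class below are hypothetical names for illustration, not the package's actual API: any class with a matching `verify` method satisfies the contract structurally, with no inheritance required.

```python
import re
from typing import Any, Protocol


class Verifier(Protocol):
    """Illustrative contract for a swappable verifier stage.

    Hypothetical: the real ebrm_system Protocols may differ in
    names and signatures.
    """

    def verify(self, candidate: str, context: dict[str, Any]) -> bool: ...


class RegexVerifier:
    """Satisfies the Verifier Protocol structurally, without inheriting."""

    def __init__(self, pattern: str) -> None:
        self._pattern = re.compile(pattern)

    def verify(self, candidate: str, context: dict[str, Any]) -> bool:
        # A purely mechanical check: the candidate either matches or it doesn't.
        return self._pattern.fullmatch(candidate) is not None


v: Verifier = RegexVerifier(r"-?\d+")
print(v.verify("42", {}))   # True
print(v.verify("maybe", {}))  # False
```

Because the contract is structural, dropping in a neural classifier or a Z3-backed verifier only requires matching the method shape, not touching the pipeline.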
Install
pip install ebrm-system
From source:
git clone https://github.com/piyushptiwari1/ebrm-system
cd ebrm-system
pip install -e ".[dev]"
Quick start
# See what the intent router thinks about a query
ebrm-system classify "Solve: 3x + 7 = 22"
# Verify an answer mechanically
ebrm-system verify "x**2 + 2*x + 1" "(x+1)**2"
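The symbolic check above can be reproduced directly with SymPy: two expressions are equivalent when their difference simplifies to zero. A minimal sketch of that idea (not the package's internal implementation):

```python
import sympy


def symbolically_equal(a: str, b: str) -> bool:
    """Return True when SymPy can prove the two expressions identical.

    simplify(lhs - rhs) == 0 is a sufficient but not necessary test:
    SymPy may fail to simplify some equivalent forms to zero.
    """
    lhs = sympy.sympify(a)
    rhs = sympy.sympify(b)
    return sympy.simplify(lhs - rhs) == 0


print(symbolically_equal("x**2 + 2*x + 1", "(x+1)**2"))  # True
print(symbolically_equal("x**2", "x**3"))                # False
```

This is what makes the verifier mechanical: it never judges plausibility, only whether the algebra checks out.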
Python API:
from ebrm_system.intent import RuleBasedClassifier
from ebrm_system.verifiers import SymPyVerifier, VerifierChain
from ebrm_system.voting import Candidate, SelfConsistencyVoter
clf = RuleBasedClassifier()
pred = clf.classify("Solve: 3x + 7 = 22")
# pred.suggested_langevin_steps, pred.suggested_trace_count, ...
chain = VerifierChain([SymPyVerifier()])
results = chain.verify("5", {"expected": "5"})
assert chain.all_passed(results)
voter = SelfConsistencyVoter(numerical=True, tolerance=0.01, weight_by="inverse_energy")
result = voter.vote([
    Candidate(answer=5.0, energy=-2.0),
    Candidate(answer=5.0, energy=-1.5),
    Candidate(answer=4.0, energy=3.0),
])
# result.answer == 5.0, weighted by low energy
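The vote above can be sketched as two steps: bucket numerically-close answers within the tolerance, then pick the bucket with the greatest total weight, where lower energy means more weight. The sketch below uses Boltzmann weights `exp(-E)` as one concrete low-energy-wins choice; the exact scheme behind the package's `weight_by="inverse_energy"` option may be defined differently.

```python
import math


def weighted_vote(
    candidates: list[tuple[float, float]], tolerance: float = 0.01
) -> float:
    """candidates: (answer, energy) pairs. Lower energy => higher weight.

    Illustrative only: exp(-E) weighting stands in for whatever
    SelfConsistencyVoter actually computes.
    """
    # Bucket answers that agree within the numerical tolerance.
    buckets: list[list[tuple[float, float]]] = []
    for ans, energy in candidates:
        for bucket in buckets:
            if abs(bucket[0][0] - ans) <= tolerance:
                bucket.append((ans, energy))
                break
        else:
            buckets.append([(ans, energy)])
    # The winning bucket has the largest summed weight, not the most votes.
    best = max(buckets, key=lambda b: sum(math.exp(-e) for _, e in b))
    return best[0][0]


print(weighted_vote([(5.0, -2.0), (5.0, -1.5), (4.0, 3.0)]))  # 5.0
```

Note that weighting changes outcomes versus plain majority voting: a single very-low-energy candidate can outvote several high-energy ones.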
Components
| Module | Status | Purpose |
|---|---|---|
| `ebrm_system.intent` | ✅ stable | Intent + difficulty + compute budget |
| `ebrm_system.verifiers` | ✅ stable | SymPy / exec / regex / Lean / DRI + intent routing |
| `ebrm_system.voting` | ✅ stable | Self-consistency with weighted bucketing |
| `ebrm_system.inference` | ✅ stable | Langevin candidates, QJL, TurboQuant KV + attention |
| `ebrm_system.reward` | ✅ stable | LatentIndex (QJL-backed nearest-neighbour reward) |
| `ebrm_system.core` | ✅ stable | HierarchicalLatentReasoner — end-to-end orchestrator |
Development
pip install -e ".[dev]"
pytest # run tests
ruff check . # lint
mypy src # type-check
pre-commit install # optional hooks
CI runs lint + type + test on Python 3.10/3.11/3.12/3.13. See .github/workflows/ci.yml.
Design principles
- Mechanical over mystical — verifiers confirm with SymPy / exec / regex; never an LLM grading an LLM.
- Budget-aware — easy queries don't pay for hard-query compute. Intent routing controls Langevin steps, restarts, and trace count.
- Audit-first — every candidate carries its trace, energy, and verifier evidence.
- Swappable — everything is a Protocol. Swap the rule-based classifier for a neural one; swap SymPy for Z3; drop in your own voter.
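The budget-aware principle amounts to a mapping from routed difficulty to compute knobs. A hedged sketch of what that mapping could look like; the class, table, and thresholds below are illustrative, though the field names echo the `pred.suggested_langevin_steps` / `pred.suggested_trace_count` attributes shown in the quick start:

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class ComputeBudget:
    """Hypothetical budget record emitted by the intent router."""

    langevin_steps: int
    trace_count: int


# Illustrative difficulty -> budget table; real thresholds are the router's call.
BUDGETS: dict[str, ComputeBudget] = {
    "easy": ComputeBudget(langevin_steps=10, trace_count=1),
    "medium": ComputeBudget(langevin_steps=50, trace_count=4),
    "hard": ComputeBudget(langevin_steps=200, trace_count=8),
}


def budget_for(difficulty: str) -> ComputeBudget:
    # Unrecognised difficulties fall back to the medium budget.
    return BUDGETS.get(difficulty, BUDGETS["medium"])


print(budget_for("easy").trace_count)     # 1
print(budget_for("hard").langevin_steps)  # 200
```

The point of the indirection is cost control: an arithmetic one-liner never pays for the restarts and parallel traces a proof-shaped query needs.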
Citation
If you use this system in academic work, please cite:
@software{ebrm_system_2026,
  author = {Tiwari, Piyush},
  title  = {ebrm-system: An Energy-Based Reasoning Machine pipeline},
  year   = {2026},
  url    = {https://github.com/piyushptiwari1/ebrm-system}
}
License
Apache 2.0. See LICENSE.
File details
Details for the file ebrm_system-0.8.0.tar.gz.
File metadata
- Download URL: ebrm_system-0.8.0.tar.gz
- Upload date:
- Size: 63.3 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.10.12
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 36cf2c27be911c1112db4cba1b9a8581d5ba8105a5a8e4726997df9c744a7d74 |
| MD5 | ea68119be723875a8d47ad9aba14b731 |
| BLAKE2b-256 | 3449feeb89837cec548913f0dadd720d1d7330cc84c46c8ad07c9ed52a377b2b |
File details
Details for the file ebrm_system-0.8.0-py3-none-any.whl.
File metadata
- Download URL: ebrm_system-0.8.0-py3-none-any.whl
- Upload date:
- Size: 60.6 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.10.12
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | b20303140335ab6b3001565e011d86930c492ab35086682f3ecb340d2ec22d36 |
| MD5 | 92bac86d734aba189ada96e0bf439751 |
| BLAKE2b-256 | 072a9f3cb0c75830f1f10b650f3ff3f8d7d9a1d60618fcd44967ff85dc4148fc |