MCP server for recursive LLM reasoning—load context, iterate with search/code/think tools, converge on answers

Project description

Aleph

License: MIT Python 3.10+ PyPI version

Aleph is an MCP server for working on large repos, logs, and documents without stuffing them into the model prompt. It keeps the working context in a Python process, then exposes tools so the model can search, peek, run code, recurse, and return small derived results.

Recommended default: install the Codex CLI, then run aleph-rlm install. Aleph still works with Claude Code, Cursor, VS Code, and other MCP clients, but Codex is the cleanest shared-session sub-query path today.

Why Aleph:

  • Load context once instead of pasting it over and over.
  • Compute inside Aleph memory with exec_python instead of leaking raw data back through the prompt.
  • Use recursive sub-queries and recipes when a single pass is not enough.
  • Save sessions and resume long investigations later.
+-----------------+    tool calls     +--------------------------+
|   LLM client    | ---------------> |  Aleph (Python process)  |
| (context budget)| <--------------- |  search / peek / exec    |
+-----------------+   small results  +--------------------------+

Quick Start

  1. Install Aleph:
     pip install "aleph-rlm[mcp]"
  2. Configure your MCP client:
     aleph-rlm install
  3. Verify Aleph is reachable in your assistant:
     get_status()
     # or
     list_contexts()
  4. Run the skill flow on a real file:
     /aleph /absolute/path/to/file.log
     # or in Codex CLI
     $aleph /absolute/path/to/file.log

The shortcut command is optional. If you want /aleph or $aleph, install docs/prompts/aleph.md in your client's command/skill folder. Exact paths are in MCP_SETUP.md.

First Workflow

Aleph is best when you load data once, do the heavy work inside Aleph, and only pull back compact answers.

load_file(path="/absolute/path/to/large_file.log", context_id="doc")
search_context(pattern="ERROR|WARN", context_id="doc")
peek_context(start=1, end=60, unit="lines", context_id="doc")
exec_python(code="""
errors = [line for line in ctx.splitlines() if "error" in line.lower()]
result = {
    "error_count": len(errors),
    "first_error": errors[0] if errors else None,
}
""", context_id="doc")
get_variable(name="result", context_id="doc")
save_session(context_id="doc", path=".aleph/doc.json")

The important habit is to compute server-side. Do not treat get_variable("ctx") as the default path. Search, filter, chunk, or summarize first, then retrieve a small result.
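To illustrate the habit, here is the kind of code you might pass to exec_python. This is a plain-Python sketch: inside Aleph, `ctx` is the loaded context; here it is simulated with a small sample log so the snippet is self-contained.

```python
# Simulated context: inside Aleph's exec_python, `ctx` is the loaded file's text.
ctx = "\n".join([
    "2024-05-01 12:00:01 INFO  service started",
    "2024-05-01 12:00:05 ERROR db connection refused",
    "2024-05-01 12:00:09 WARN  retrying in 2s",
    "2024-05-01 12:00:11 ERROR db connection refused",
])

# Filter and aggregate server-side, then expose only a small summary.
errors = [line for line in ctx.splitlines() if "ERROR" in line]
result = {
    "error_count": len(errors),
    "first_error": errors[0] if errors else None,
}
print(result)
```

Retrieving `result` with get_variable then returns a few dozen bytes instead of the whole log.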

If you want terminal-only mode instead of MCP, use:

aleph run "Summarize this log" --provider cli --model codex --context-file app.log

Common Workloads

| Scenario | What Aleph is good at |
|---|---|
| Large log analysis | Load big files, trace patterns, correlate events |
| Codebase navigation | Search symbols, inspect routes, trace behavior |
| Data exploration | Analyze JSON, CSV, and mixed text with Python helpers |
| Long document review | Load PDFs, Word docs, HTML, and compressed logs |
| Recursive investigations | Split work into sub-queries instead of one giant prompt |
| Long-running sessions | Save and resume memory packs across sessions |

Core Tools

| Category | Primary tools | What they do |
|---|---|---|
| Load context | load_context, load_file, list_contexts, diff_contexts | Put data into Aleph memory and inspect what is loaded |
| Navigate | search_context, semantic_search, peek_context, chunk_context, rg_search | Find the relevant slice before asking for an answer |
| Compute | exec_python, get_variable | Run code over the full context and retrieve only the derived result |
| Reason | think, evaluate_progress, get_evidence, finalize | Structure progress and close out with evidence |
| Orchestrate | configure, validate_recipe, estimate_recipe, run_recipe, run_recipe_code | Switch backends and automate repeated reasoning patterns |
| Persist | save_session, load_session | Keep long investigations outside the prompt window |

Inside exec_python, Aleph also exposes helpers such as search(...), chunk(...), lines(...), sub_query(...), sub_query_batch(...), and sub_aleph(...). Recursive helpers live inside the REPL, not as top-level MCP tools.
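The split-and-fan-out pattern these helpers enable can be sketched in plain Python. The helper names below come from the list above, but their bodies here are placeholders: the real `chunk` and `sub_query_batch` are provided inside Aleph's REPL, and their actual signatures may differ.

```python
# Hypothetical stand-ins for Aleph's REPL helpers. The real versions are
# injected into exec_python; these placeholders only show the shape of use.
def chunk(text: str, size: int) -> list[str]:
    """Split text into fixed-size pieces."""
    return [text[i:i + size] for i in range(0, len(text), size)]

def sub_query_batch(prompts: list[str]) -> list[str]:
    """Placeholder: the real helper fans each prompt out to a sub-model."""
    return [f"summary of: {p[:30]}" for p in prompts]

ctx = "lorem ipsum " * 50  # stand-in for a large loaded context
pieces = chunk(ctx, 200)
summaries = sub_query_batch([f"Summarize: {p}" for p in pieces])
print(len(pieces), len(summaries))
```

The point is the shape: the model never sees `ctx` or `pieces`, only the short summaries that come back.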

Safety Model

Aleph is built to keep raw context out of the model window unless you explicitly pull it back:

  • Tool responses are capped and truncated.
  • get_variable("ctx") is policy-aware and should not be your default path.
  • exec_python stdout, stderr, and return values are bounded independently.
  • ALEPH_CONTEXT_POLICY=isolated adds stricter session export/import rules and more defensive defaults.
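A minimal sketch of what response capping might look like. This is not Aleph's actual implementation; the limit and the truncation marker are made up for illustration.

```python
def cap_output(text: str, limit: int = 2000) -> str:
    """Truncate a tool response to `limit` characters, marking the cut."""
    if len(text) <= limit:
        return text
    return text[:limit] + f"\n... [truncated {len(text) - limit} chars]"

big = "x" * 5000
capped = cap_output(big)
```

Bounding stdout, stderr, and return values separately means one noisy stream cannot crowd the others out of the response.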

The safest pattern is always:

  1. Load the large context into Aleph memory.
  2. Search or compute inside Aleph.
  3. Retrieve only the small result you need.

Development

git clone https://github.com/Hmbown/aleph.git
cd aleph
pip install -e ".[dev,mcp]"
pytest tests/ -v
ruff check aleph/ tests/

License

MIT


Download files

Download the file for your platform.

Source Distribution

aleph_rlm-0.8.9.tar.gz (318.5 kB)

Uploaded Source

Built Distribution


aleph_rlm-0.8.9-py3-none-any.whl (145.0 kB)

Uploaded Python 3

File details

Details for the file aleph_rlm-0.8.9.tar.gz.

File metadata

  • Download URL: aleph_rlm-0.8.9.tar.gz
  • Upload date:
  • Size: 318.5 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for aleph_rlm-0.8.9.tar.gz
| Algorithm | Hash digest |
|---|---|
| SHA256 | b89e39f943736a1fd54247c530b761cd29998866a171d04c3cd06241355ebba3 |
| MD5 | 2463bde77246b5d9cd98a3a918faceb2 |
| BLAKE2b-256 | 639aeee5bec46f6ade5faa80c035916bde25dd8c75269c26b0a67c074921b36f |


Provenance

The following attestation bundles were made for aleph_rlm-0.8.9.tar.gz:

Publisher: publish.yml on Hmbown/aleph

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file aleph_rlm-0.8.9-py3-none-any.whl.

File metadata

  • Download URL: aleph_rlm-0.8.9-py3-none-any.whl
  • Upload date:
  • Size: 145.0 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for aleph_rlm-0.8.9-py3-none-any.whl
| Algorithm | Hash digest |
|---|---|
| SHA256 | bf7969b334f4718bdb13291909426498a44118f4b2aec371b89f0fbf8c4a8905 |
| MD5 | 46ff4c46fccfda4e645cf3247fcda80e |
| BLAKE2b-256 | 4b694854587284e8940f9b4dbc28f079ca5a0cbe32ce13494a4d3243c4e5ea35 |


Provenance

The following attestation bundles were made for aleph_rlm-0.8.9-py3-none-any.whl:

Publisher: publish.yml on Hmbown/aleph

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
