# zigrlm
zigrlm is an experimental Zig runtime for Recursive Language Model (RLM)
workflows. It is directly inspired by
alexzhang13/rlm, the Python reference
implementation for Recursive Language Models.
Use zigrlm when you want one small CLI process to own an RLM run: the root
prompt, recursive child calls, bounded local compute, parallel fanout, trace
events, and usage accounting.
## Install

The Python adapter is published as `zigrlm`:

```sh
pip install zigrlm
```
The Zig CLI is built from source:

```sh
git clone https://github.com/Hmbown/zigrlm.git
cd zigrlm
zig build
zig-out/bin/zigrlm --help
```
The Python adapter shells out to the Zig binary. After a source build it finds
zig-out/bin/zigrlm automatically from this checkout. In other layouts, put the
binary on PATH or set:

```sh
export ZIGRLM_BIN=/absolute/path/to/zigrlm
```
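The lookup order described above can be sketched in Python. This is a hedged illustration of the resolution behavior, not the adapter's actual code; the checkout-relative path and function name are assumptions:

```python
import os
import shutil

def resolve_zigrlm_bin(checkout_root="."):
    """Illustrative resolution order: ZIGRLM_BIN, then a source-build
    path under the checkout, then PATH."""
    # 1. An explicit override wins.
    env_bin = os.environ.get("ZIGRLM_BIN")
    if env_bin:
        return env_bin
    # 2. A source build places the binary under zig-out/bin.
    local = os.path.join(checkout_root, "zig-out", "bin", "zigrlm")
    if os.path.isfile(local) and os.access(local, os.X_OK):
        return local
    # 3. Fall back to whatever is on PATH (None if absent).
    return shutil.which("zigrlm")
```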
## Relationship To RLM
The upstream RLM project is the conceptual reference:
- Upstream: https://github.com/alexzhang13/rlm
- Paper, blog, and docs are linked from that repository.
- Upstream package: `rlms`, imported as `rlm`.
zigrlm is not a fork of that Python codebase. It reimplements the core
control loop in Zig and keeps the command shapes explicit:
- flat proxy commands for single model completions;
- recursive commands for RLM workflows;
- a `dszig` command for DSPy-style signature experiments;
- JSONL trace and metadata output for inspection.
If you want the full Python ecosystem, sandbox matrix, and upstream API, use
alexzhang13/rlm. If you want a compact Zig CLI for auditable local recursive
fanout, try zigrlm.
## Quick Start

Run no-network checks:

```sh
zig build test
zig build run -- demo
zig build run -- echo "hello"
zig build run -- echo --trace /tmp/zigrlm-echo-trace.jsonl "hello"
```
Run a plain OpenAI completion through the flat proxy:

```sh
printf 'Return exactly OK.' \
  | zig-out/bin/zigrlm openai-proxy --model gpt-5.4
```
Run a recursive Codex workflow:

````sh
zig-out/bin/zigrlm cli-codex --timeout-ms 600000 'Solve via a child:
```repl
rlm_query child = "Answer this briefly: " + context
FINAL_VAR(child)
```'
````
Run parallel child calls inside one RLM process:

````sh
zig-out/bin/zigrlm cli-codex --timeout-ms 600000 'Fan out:
```repl
rlm_query_batched answers = "Task A" | "Task B" | "Task C"
FINAL_VAR(answers)
```'
````
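Conceptually, the batched form fans the `|`-separated prompts out as independent child calls and joins the answers back in prompt order. A rough Python analogy of that join behavior, with a stub in place of a real model call (the `[i]` numbering format is an assumption for illustration, not zigrlm's exact output):

```python
from concurrent.futures import ThreadPoolExecutor

def fan_out(prompts, call, max_workers=4):
    """Run independent child calls concurrently, then join the
    responses as indexed blocks in the original prompt order."""
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        answers = list(pool.map(call, prompts))
    return "\n".join(f"[{i}] {a}" for i, a in enumerate(answers))

# Stub "model" that just echoes the task:
print(fan_out(["Task A", "Task B", "Task C"], lambda p: f"done: {p}"))
```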
## Command Families
Keep flat completions and recursive workflows separate.
| Command | Shape | Use when |
|---|---|---|
| `openai-proxy` | stdin to stdout | One plain OpenAI Responses completion. |
| `codex-proxy` | stdin to stdout | One plain Codex CLI completion. |
| `claude-proxy` | stdin to stdout | One plain Claude CLI completion. |
| `cli-openai` | recursive RLM | Root and child calls go through `openai-proxy`. |
| `cli-codex` | recursive RLM | Root and child calls go through `codex-proxy`. |
| `cli-claude` | recursive RLM | Root and child calls go through `claude-proxy`. |
| `zai` | recursive RLM | Root and child calls use Z.ai chat completions. |
| `cli` | recursive RLM | Root and child calls use custom stdin/stdout commands. |
| `dszig` | DSPy-style RLM | Signature inputs become Python variables and outputs use `SUBMIT(...)`. |
| `echo`, `demo` | diagnostic | No-network runtime checks. |
Rule of thumb:

- single completion? -> `*-proxy`
- workflow with subcalls? -> `cli-<backend>`, `zai`, or `cli`
- parallel children? -> `rlm_query_batched` inside one recursive command
- custom backend? -> `cli` with `ZIGRLM_MAIN_CMD` / `ZIGRLM_RLM_CMD`
- DSPy-style signature? -> `dszig`
## The `repl` DSL

Recursive commands ask the root model to return fenced `repl` blocks:
```repl
let name = "text" + context
set name = expression
print(expression)
js name = "const n = context.length; FINAL(String(n));"
llm_query name = expression
rlm_query name = expression
llm_query_batched name = expr | expr | ...
rlm_query_batched name = expr | expr | ...
FINAL(expression)
FINAL_VAR(name)
```
Important details:

- `context` is the original user prompt for the current RLM frame.
- Only fences tagged exactly as `repl` are executed.
- `llm_query` is a direct child model call.
- `rlm_query` starts a child RLM loop, unless the depth limit has been reached.
- Batched query forms run independent calls concurrently and join responses as indexed blocks.
- A block should end with `FINAL(...)` or `FINAL_VAR(...)`.
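Since only `repl`-tagged fences are executed, the loop must first pull those blocks out of the model's reply and ignore every other fence. zigrlm implements this in Zig; the regex-based sketch below is illustrative only:

```python
import re

# Match fenced blocks tagged exactly "repl"; other tags are ignored.
REPL_FENCE = re.compile(r"```repl\n(.*?)```", re.DOTALL)

def extract_repl_blocks(reply):
    """Return the body of every repl-tagged fence in a model reply."""
    return [m.strip() for m in REPL_FENCE.findall(reply)]

reply = 'Thinking...\n```repl\nFINAL("done")\n```\n```python\nignored\n```'
print(extract_repl_blocks(reply))  # only the repl block survives
```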
## DSzig Python Adapter
The PyPI package exposes a small Python adapter:

```python
from dszig import RLM

rlm = RLM(
    "question: str -> answer: str",
    main_cmd="python3 -c 'print(\"```python\"); print(\"SUBMIT(answer=question.upper())\"); print(\"```\")'",
    rlm_cmd="/bin/cat",
)
result = rlm(question="alpha")
assert result.answer == "ALPHA"
```
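The signature string `"question: str -> answer: str"` follows DSPy's `inputs -> outputs` convention. A hedged sketch of how such a string can be split into typed fields (not necessarily dszig's exact parser; the default-to-`str` behavior is an assumption):

```python
def parse_signature(sig):
    """Split 'question: str -> answer: str' into (inputs, outputs),
    each a dict of field name -> type annotation."""
    def fields(side):
        out = {}
        for part in side.split(","):
            name, _, ann = part.partition(":")
            # Assume plain 'str' when no annotation is given.
            out[name.strip()] = ann.strip() or "str"
        return out
    inputs, _, outputs = sig.partition("->")
    return fields(inputs), fields(outputs)

print(parse_signature("question: str -> answer: str"))
```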
You can also import the same adapter through the package name:

```python
from zigrlm import RLM
```
If you already have real DSPy imported and want to patch it explicitly:

```python
import dspy
from dszig.dspy_compat import install

install(dspy)
```
The local python/dspy package in this repository is only a smoke-test shim.
It is not the full Stanford DSPy package and is not shipped as part of the PyPI
wheel.
## Runtime Flags

Recursive commands support:

```
--file PATH
--stdin
--trace PATH
--metadata PATH
--timeout-ms N
--max-depth N
--max-iterations N
--max-concurrent-subcalls N
--max-runtime-ms N
--max-calls N
--max-input-bytes N
--max-output-bytes N
--max-consecutive-errors N
--environment script|python|docker-python
--python-bin PATH
--docker-bin PATH
--docker-image IMAGE
--docker-setup-timeout-ms N
--persistent
--compaction
--compaction-threshold-bytes N
```
Prefer trace paths under /tmp, for example:

```
--trace /tmp/zigrlm-my-run.jsonl
```
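Because traces are JSONL, each line is one JSON object and can be inspected with a few lines of Python. The event field name used below (`type`) is an assumption for illustration; inspect a real trace for the actual schema:

```python
import json

def summarize_trace(path):
    """Count JSONL trace events by their 'type' field (assumed name)."""
    counts = {}
    with open(path, encoding="utf-8") as f:
        for line in f:
            line = line.strip()
            if not line:
                continue  # tolerate blank lines
            event = json.loads(line)
            kind = str(event.get("type", "unknown"))
            counts[kind] = counts.get(kind, 0) + 1
    return counts
```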
Traces can include prompt and model-output previews. Do not commit ad hoc traces that contain private prompts, outputs, or keys.
## Configuration

Configuration is read from the process environment first, then from a local
`.env` file. Do not commit `.env`; it normally contains live keys.

Relevant variables:
```
OPENAI_API_KEY
OPENAI_BASE_URL
OPENAI_MAIN_MODEL
OPENAI_RLM_MODEL
CODEX_BIN
CODEX_MAIN_MODEL
CODEX_RLM_MODEL
CLAUDE_BIN
CLAUDE_MAIN_MODEL
CLAUDE_RLM_MODEL
CLAUDE_EFFORT
CLAUDE_MAIN_EFFORT
CLAUDE_RLM_EFFORT
ZAI_CODING_API_KEY
ZAI_CODING_BASE_URL
ZAI_MAIN_MODEL
ZAI_RLM_MODEL
ZIGRLM_BIN
ZIGRLM_MAIN_CMD
ZIGRLM_RLM_CMD
```
Generic cli backends should read the full prompt from stdin and write only the
completion text to stdout.
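That contract is easy to satisfy in any language. A toy Python backend that just uppercases the prompt (purely illustrative; a real backend would forward the prompt to a model):

```python
#!/usr/bin/env python3
"""Toy zigrlm `cli` backend: read the full prompt from stdin,
write only the completion text to stdout."""
import sys

def complete(prompt):
    # A real backend would call a model here instead of upper-casing.
    return prompt.upper()

if __name__ == "__main__":
    sys.stdout.write(complete(sys.stdin.read()))
```

Point `ZIGRLM_MAIN_CMD` / `ZIGRLM_RLM_CMD` at a script like this to drive the generic `cli` command.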
## Project Layout

- `src/main.zig`: CLI commands, provider wrappers, runtime option parsing.
- `src/rlm.zig`: RLM loop, child calls, batching, usage, JSONL tracing.
- `src/dspy_rlm.zig`: DSPy-style signature runner with `SUBMIT(...)`.
- `src/env.zig`: `repl` DSL execution and Node `vm` bridge.
- `src/python_env.zig`: Python REPL subprocess with host callbacks.
- `src/docker_env.zig`: Docker-backed Python REPL with host callbacks.
- `python/dszig`: Python adapter for `zigrlm dszig`.
- `docs/RLM_PARITY.md`: current parity notes against upstream RLM/DSPy ideas.
## Development

Run the main checks:

```sh
zig build test
zig build
zig test src/dspy_rlm.zig
zig test src/python_env.zig
python3 -m py_compile scripts/compare_rlm_main.py scripts/run_oolong.py python/dszig/*.py python/dspy/*.py python/dspy/predict/*.py python/tests/*.py
PYTHONPATH=python python3 -m unittest discover -s python/tests -v
```
Build and validate the Python package:

```sh
python3 -m build
python3 -m twine check --strict dist/*
```
## Safety Notes

- The JavaScript sandbox uses Node `vm`. Treat it as trusted local compute, not a hard security boundary.
- `claude-proxy` disables Claude Code tools by default. Use `--allow-tools` only when tool access is intentional.
- `--environment docker-python` runs Python in a container with `--network none` by default, but the host process still controls model calls and mounted state.
- `--timeout-ms` bounds individual subprocess calls, not the whole recursive run.
## License
MIT. See LICENSE.