chorale
Run N AI agents that brainstorm with you on a single text file, safely. Mix and match Claude, Gemini, Codex, Ollama, and your own.
You write under ## user. Each AI agent owns its own ## agent:<role> section and edits it in place to reply. The file is the conversation — no chat windows, no scrolling transcripts, no lost edits. Concurrent saves are reconciled by cotype's 3-way merge; the harness splices each agent's reply into ONLY its own section's bytes, so two agents editing two different sections cannot conflict by construction.
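For concreteness, a shared file might look like this (an illustrative layout, not a verbatim template shipped with the tool):

```markdown
## user

Let's plan the school crêpe stand. Budget is €50.

## agent:cook

(the cook's reply accumulates here)

## agent:logistics

(the logistics agent's reply accumulates here)
```

Each agent rewrites only the body under its own `## agent:<role>` header; your edits live under `## user`.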
Three Claude personas (cook, logistics, ux-designer) plus a note-taker design a school crêpe stand with the user — all in one brainstorm.md. Same demo applies to the multi-backend mode below; mix in gemini, codex, or ollama per role.
pip install chorale
# all-claude (default)
chorale brainstorm.md cook logistics ux-designer note-taker
# mix four different brains in one chorale
chorale brainstorm.md \
cook \
logistics@gemini \
ux-designer@codex:gpt-5 \
note-taker@ollama:llama3
Edit brainstorm.md in any editor (with cotype-mode for live updates in Emacs); agents see your saves on their next poll and respond.
Why this exists
Long sessions with AI agents drift into chat transcripts that scroll away from the work you actually want at the end. chorale flips it: the document accumulates in place, every actor has a labelled section, and disagreements between actors surface as inline diff3 markers rather than lost work.
The tool was extracted from cotype's examples/headless-agents.sh — that bash script is still the readable "what's the idea, on one screen" demo; this Python rewrite is the production-friendly version: tested, configurable, extensible.
Install
pip install chorale
Requires Python ≥ 3.11, cotype (auto-installed), and at least one supported AI CLI on PATH (claude, gemini, codex, ollama, or your own — see Backends below).
Usage
chorale FILE ROLE_SPEC [ROLE_SPEC ...] [OPTIONS]
A role spec is one of:
| Form | Meaning |
|---|---|
| `cook` | default backend, default model |
| `cook@gemini` | gemini, gemini's default model |
| `cook@gemini:gemini-2.5-pro` | gemini, specific model |
| `cook@my-local` | a backend you defined in the config file |
Examples:
# four claude agents on a fresh brainstorm
chorale brainstorm.md cook logistics ux-designer note-taker
# mix brains: each role uses a different CLI
chorale brainstorm.md \
cook \
logistics@gemini \
ux-designer@codex:gpt-5 \
note-taker@ollama
# override the default backend for the whole run
chorale notes.md reviewer linter --default-backend gemini
# tighter polling, faster turns
chorale notes.md reviewer linter --interval 0.5 --stagger 2
# custom prompt template
chorale notes.md author editor --prompt-file my-prompt.txt
`chorale --help` prints the full surface: the role-spec syntax, the config-file format, and a copy-paste example.
Backends
| Backend | Invocation | Default model |
|---|---|---|
| `claude` | `claude --print -p PROMPT --model MODEL` | `claude-sonnet-4-6` |
| `gemini` | `gemini -p PROMPT --model MODEL` | (gemini CLI's own default) |
| `codex` | `codex exec PROMPT --model MODEL` | (codex CLI's own default) |
| `ollama` | `ollama run MODEL` (prompt via stdin) | `llama3` |
You only need the binaries you actually use on PATH — a pure-claude run does not need ollama installed and vice versa.
Config file
Optional. Default location: ~/.config/chorale/config.toml (override with --config PATH).
[defaults]
backend = "claude"
model = "claude-sonnet-4-6"
# Override a built-in's default model:
[backends.gemini]
default_model = "gemini-2.5-pro"
# Define a fully custom backend (e.g. a local model server, a research CLI):
[backends.my-local]
command = ["my-tool", "--prompt={prompt}", "--model={model}"]
prompt_via = "argv" # or "stdin" to pipe the prompt instead
default_model = "v1"
timeout = 90.0
A custom backend can then be referenced as role@my-local in any role spec.
Custom prompts
Pass --prompt-file PATH to override the built-in brainstorm prompt. The file is treated as a str.format template with two placeholders the harness fills in per turn:
- `{role}` — the agent's role name (e.g. `cook`).
- `{file_content}` — the current state of the shared file.
Anything an agent emits outside its own ## agent:{role} section is discarded by the splicer, so prompts only need to nudge the agent toward filling its own section sensibly.
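Since the prompt file is a plain `str.format` template, a custom one is just text with those two placeholders. A toy example (the wording is illustrative, not the built-in prompt):

```python
# A minimal custom prompt template, like what --prompt-file would load.
TEMPLATE = """You are the {role} in a collaborative brainstorm.
Below is the current shared file. Rewrite ONLY the body of your
'## agent:{role}' section; anything you emit outside it is discarded.

{file_content}"""

# Per turn, the harness fills in the role and the file's current bytes:
prompt = TEMPLATE.format(role="cook",
                         file_content="## user\n\nPlan a crêpe stand.")
```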
Stopping
Ctrl-C on the running process stops all agents cleanly. While running, you can edit the shared file in any editor; agents will see your edits on their next poll. If a conflict happens (you and an agent both edit the same section), chorale idles all agents and waits for you to resolve it (cotype resolve FILE after editing the markers).
How it works
┌─────────────────────────────────────────────────────────────┐
│ user (any editor) ─┐ │
│ │ writes under ## user │
│ ▼ │
│ ┌──── shared.md (cotype-managed) ────┐ │
│ agent_A ───┤ ├─── disk │
│ agent_B ───┤ one section per actor │ │
│ agent_C ───┘ diff3 reconciles concurrent saves│ │
│ │ │ │
│ └────── chorale runtime ─────────┘ │
└─────────────────────────────────────────────────────────────┘
Each agent thread runs an independent loop:
1. `cotype status` — idle if a conflict is pending (only the user can resolve).
2. `cotype open` — capture a fresh base; skip if it hasn't changed since our last save.
3. `claude --print -p PROMPT` — generate a candidate reply.
4. Splice — parse the agent's output as Markdown sections, take only the body of `## agent:<role>`, and splice it into the bytes from `base_path`. By construction, no other section's bytes can change.
5. `cotype save` — submit the spliced bytes; cotype decides direct / merged / noop / conflict.
The structural splice is the key idea: the agent can produce arbitrary content, but only its own section's bytes ever reach the file. Two agents editing two different sections produce edits in disjoint byte ranges, no matter how adjacent the section headers are.
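The splice can be modeled in a few lines of Python. This is a simplified sketch, not chorale's implementation; it assumes sections are delimited by `## ` headers:

```python
import re

def splice_section(base: str, role: str, new_body: str) -> str:
    """Replace only the body of '## agent:<role>' in base,
    leaving every other section's bytes untouched."""
    header = f"## agent:{role}"
    # Match our header line, then lazily everything up to the next
    # '## ' header (or end of file), which bounds our section's body.
    pattern = re.compile(
        rf"(^{re.escape(header)}\n)(.*?)(?=^## |\Z)",
        re.MULTILINE | re.DOTALL,
    )
    # count=1: splice exactly one section; a lambda avoids backslash
    # escapes in new_body being interpreted as group references.
    return pattern.sub(lambda m: m.group(1) + new_body, base, count=1)
```

Because the replacement span is bounded by this role's own header and the next header, two agents splicing two different sections touch disjoint byte ranges, which is what makes their concurrent saves merge cleanly.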
Tests
pip install pytest
pytest -q
Tests cover the splicer's contract (round-trip, role isolation, codefence stripping, no-change short-circuit) and the template generator. The runtime (subprocess wrappers, threading) is intentionally untested — it's almost entirely IO and best validated by running the demo.
Compared to
- cotype — the byte-level safe-save CLI underneath. `chorale` is the agent harness; `cotype` is the merge engine.
- `headless-agents.sh` — the original bash version, still in cotype's repo as a one-screen reference. `chorale` is the same idea with structure (config, tests, prompt extension point).
License
MIT. See LICENSE.