
chorale


Run N AI agents that brainstorm with you on a single text file, safely. Mix and match Claude, Gemini, Codex, Ollama, and your own.

You write under ## user. Each AI agent owns its own ## agent:<role> section and edits it in place to reply. The file is the conversation — no chat windows, no scrolling transcripts, no lost edits. Concurrent saves are reconciled by cotype's 3-way merge; the harness splices each agent's reply into ONLY its own section's bytes, so two agents editing two different sections cannot conflict by construction.
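Concretely, a shared file might look like this (an illustrative layout; the roles and contents here are placeholders, not a required structure):

```markdown
## user
Plan a dinner party for eight on Friday. Constraints: one vegetarian guest.

## agent:cook
(the cook agent edits this section in place to reply)

## agent:logistics
(the logistics agent edits this section in place to reply)
```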

pip install chorale

# all-claude (default)
chorale brainstorm.md cook logistics ux-designer note-taker

# mix four different brains in one chorale
chorale brainstorm.md \
    cook \
    logistics@gemini \
    ux-designer@codex:gpt-5 \
    note-taker@ollama:llama3

Edit brainstorm.md in any editor (with cotype-mode for live updates in Emacs); agents see your saves on their next poll and respond.

Why this exists

Long sessions with AI agents drift into chat transcripts that scroll away from the work you actually want at the end. chorale flips it: the document accumulates in place, every actor has a labelled section, and disagreements between actors surface as inline diff3 markers rather than lost work.

The tool was extracted from cotype's examples/headless-agents.sh — that bash script is still the readable "what's the idea, on one screen" demo; this Python rewrite is the production-friendly version: tested, configurable, extensible.

Install

pip install chorale

Requires Python ≥ 3.11, cotype (auto-installed), and at least one supported AI CLI on PATH (claude, gemini, codex, ollama, or your own — see Backends below).

Usage

chorale FILE ROLE_SPEC [ROLE_SPEC ...] [OPTIONS]

A role spec is one of:

| Form | Meaning |
| --- | --- |
| `cook` | default backend, default model |
| `cook@gemini` | gemini, gemini's default model |
| `cook@gemini:gemini-2.5-pro` | gemini, specific model |
| `cook@my-local` | a backend you defined in the config file |

Examples:

# four claude agents on a fresh brainstorm
chorale brainstorm.md cook logistics ux-designer note-taker

# mix brains: each role uses a different CLI
chorale brainstorm.md \
    cook \
    logistics@gemini \
    ux-designer@codex:gpt-5 \
    note-taker@ollama

# override the default backend for the whole run
chorale notes.md reviewer linter --default-backend gemini

# tighter polling, faster turns
chorale notes.md reviewer linter --interval 0.5 --stagger 2

# custom prompt template
chorale notes.md author editor --prompt-file my-prompt.txt

chorale --help prints the full surface, the role-spec syntax, the config-file format, and a copy-paste example.

Backends

| Backend | Invocation | Default model |
| --- | --- | --- |
| claude | `claude --print -p PROMPT --model MODEL` | claude-sonnet-4-6 |
| gemini | `gemini -p PROMPT --model MODEL` | (gemini CLI's own default) |
| codex | `codex exec PROMPT --model MODEL` | (codex CLI's own default) |
| ollama | `ollama run MODEL` (prompt via stdin) | llama3 |

You only need the binaries you actually use on PATH — a pure-claude run does not need ollama installed and vice versa.

Config file

Optional. Default location: ~/.config/chorale/config.toml (override with --config PATH).

[defaults]
backend = "claude"
model   = "claude-sonnet-4-6"

# Override a built-in's default model:
[backends.gemini]
default_model = "gemini-2.5-pro"

# Define a fully custom backend (e.g. a local model server, a research CLI):
[backends.my-local]
command       = ["my-tool", "--prompt={prompt}", "--model={model}"]
prompt_via    = "argv"        # or "stdin" to pipe the prompt instead
default_model = "v1"
timeout       = 90.0

A custom backend can then be referenced as role@my-local in any role spec.

Custom prompts

Pass --prompt-file PATH to override the built-in brainstorm prompt. The file is treated as a str.format template with two placeholders the harness fills in per turn:

  • {role} — the agent's role name (e.g. cook).
  • {file_content} — the current state of the shared file.

Anything an agent emits outside its own ## agent:{role} section is discarded by the splicer, so prompts only need to nudge the agent toward filling its own section sensibly.
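A prompt file is plain `str.format` text, so a minimal one can be sketched directly (the wording below is an example, not the built-in prompt; only the two placeholders are guaranteed):

```python
# Example contents of a --prompt-file, filled per turn by the harness.
template = (
    "You are the {role} agent. Reply ONLY inside your own "
    "'## agent:{role}' section.\n\nCurrent file:\n{file_content}\n"
)

prompt = template.format(role="cook", file_content="## user\nPlan dinner.\n")
```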

Stopping

Ctrl-C on the running process stops all agents cleanly. While running, you can edit the shared file in any editor; agents will see your edits on their next poll. If a conflict happens (you and an agent both edit the same section), chorale idles all agents and waits for you to resolve it (cotype resolve FILE after editing the markers).

How it works

┌─────────────────────────────────────────────────────────────┐
│  user (any editor) ─┐                                        │
│                     │ writes under ## user                   │
│                     ▼                                        │
│             ┌──── shared.md (cotype-managed) ────┐           │
│  agent_A ───┤                                    ├─── disk   │
│  agent_B ───┤   one section per actor            │           │
│  agent_C ───┘   diff3 reconciles concurrent saves│           │
│                 │                                │           │
│                 └────── chorale runtime ─────────┘           │
└─────────────────────────────────────────────────────────────┘

Each agent thread runs an independent loop:

  1. cotype status — idle if a conflict is pending (only the user can resolve).
  2. cotype open — capture a fresh base; skip if it hasn't changed since our last save.
  3. claude --print -p PROMPT — generate a candidate reply.
  4. Splice: parse the agent's output as Markdown sections, take only the body of ## agent:<role>, splice it into the bytes from base_path. By construction, no other section's bytes can change.
  5. cotype save — submit the spliced bytes; cotype decides direct / merged / noop / conflict.

The structural splice is the key idea: the agent can produce arbitrary content, but only its own section's bytes ever reach the file. Two agents editing two different sections produce edits in disjoint byte ranges, no matter how adjacent the section headers are.
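The splice step can be sketched with a regex over `## `-delimited sections. This is an illustration of the idea, assuming sections run from one `## ` header to the next; it is not chorale's actual implementation:

```python
import re


def splice(base: str, agent_output: str, role: str) -> str:
    """Replace only the '## agent:<role>' body of base with the agent's version."""
    header = f"## agent:{role}"
    # A section is its header line plus everything up to the next '## ' header.
    pattern = re.compile(
        rf"(^{re.escape(header)}\n)(.*?)(?=^## |\Z)", re.M | re.S
    )
    m = pattern.search(agent_output)
    if not m:
        return base  # agent produced nothing usable for its section
    new_body = m.group(2)
    # Lambda replacement avoids backslash-escape surprises in new_body.
    return pattern.sub(lambda mb: mb.group(1) + new_body, base, count=1)
```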

Tests

pip install pytest
pytest -q

Tests cover the splicer's contract (round-trip, role isolation, codefence stripping, no-change short-circuit) and the template generator. The runtime (subprocess wrappers, threading) is intentionally untested — it's almost entirely IO and best validated by running the demo.

Compared to

  • cotype — the byte-level safe-save CLI underneath. chorale is the agent harness; cotype is the merge engine.
  • headless-agents.sh — the original bash version, still in cotype's repo as a one-screen reference. chorale is the same idea with structure (config, tests, prompt extension point).

License

MIT. See LICENSE.
