
chorale


Run N AI agents that brainstorm with you on a single text file, safely. Mix and match Claude, Gemini, Codex, Ollama, and your own.

You write under ## user. Each AI agent owns its own ## agent:<role> section and edits it in place to reply. The file is the conversation — no chat windows, no scrolling transcripts, no lost edits. Concurrent saves are reconciled by cotype's 3-way merge; the harness splices each agent's reply into ONLY its own section's bytes, so two agents editing two different sections cannot conflict by construction.
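
For instance, a shared file with one user and two agents might look like this (contents invented for illustration):

```
## user
What should the stand charge per crêpe?

## agent:cook
Batter is cheap; toppings drive the cost. (cook rewrites only this body)

## agent:logistics
(logistics replies here, and only here)
```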

chorale demo: three Claude personas + a note-taker brainstorming with a user in one shared file

Three Claude personas (cook, logistics, ux-designer) plus a note-taker design a school crêpe stand with the user — all in one brainstorm.md. The same demo works in the multi-backend mode below: mix in gemini, codex, or ollama per role.

pip install chorale

# all-claude (default)
chorale brainstorm.md cook logistics ux-designer note-taker

# mix four different brains in one chorale
chorale brainstorm.md \
    cook \
    logistics@gemini \
    ux-designer@codex:gpt-5 \
    note-taker@ollama:llama3

Edit brainstorm.md in any editor (with cotype-mode for live updates in Emacs); agents see your saves on their next poll and respond.

Why this exists

Long sessions with AI agents drift into chat transcripts that scroll away from the work you actually want at the end. chorale flips it: the document accumulates in place, every actor has a labelled section, and disagreements between actors surface as inline diff3 markers rather than lost work.

The tool was extracted from cotype's examples/headless-agents.sh — that bash script is still the readable "what's the idea, on one screen" demo; this Python rewrite is the production-friendly version: tested, configurable, extensible.

Install

pip install chorale

Requires Python ≥ 3.11, cotype (auto-installed), and at least one supported AI CLI on PATH (claude, gemini, codex, ollama, or your own — see Backends below).

Usage

chorale FILE ROLE_SPEC [ROLE_SPEC ...] [OPTIONS]

A role spec is one of:

Form                         Meaning
cook                         default backend, default model
cook@gemini                  gemini, gemini's default model
cook@gemini:gemini-2.5-pro   gemini, specific model
cook@my-local                a backend you defined in the config file
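
The grammar above splits on `@` and `:`; a minimal parsing sketch (`parse_role_spec` is a hypothetical helper, not chorale's API):

```python
def parse_role_spec(spec: str):
    """Split ROLE[@BACKEND[:MODEL]] into (role, backend, model).

    backend/model come back as None when the spec leaves them
    to the configured defaults.
    """
    role, at, rest = spec.partition("@")
    if not at:                      # bare role: defaults all the way down
        return role, None, None
    backend, colon, model = rest.partition(":")
    return role, backend, (model if colon else None)

print(parse_role_spec("ux-designer@codex:gpt-5"))
# → ('ux-designer', 'codex', 'gpt-5')
```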

Examples:

# four claude agents on a fresh brainstorm
chorale brainstorm.md cook logistics ux-designer note-taker

# mix brains: each role uses a different CLI
chorale brainstorm.md \
    cook \
    logistics@gemini \
    ux-designer@codex:gpt-5 \
    note-taker@ollama

# override the default backend for the whole run
chorale notes.md reviewer linter --default-backend gemini

# tighter polling, faster turns
chorale notes.md reviewer linter --interval 0.5 --stagger 2

# custom prompt template
chorale notes.md author editor --prompt-file my-prompt.txt

chorale --help prints the full surface, the role-spec syntax, the config-file format, and a copy-paste example.

Backends

Backend   Invocation                               Default model
claude    claude --print -p PROMPT --model MODEL   claude-sonnet-4-6
gemini    gemini -p PROMPT --model MODEL           (gemini CLI's own default)
codex     codex exec PROMPT --model MODEL          (codex CLI's own default)
ollama    ollama run MODEL (prompt via stdin)      llama3

You only need the binaries you actually use on PATH — a pure-claude run does not need ollama installed and vice versa.
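
The table boils down to a small dispatch over argv versus stdin. A hedged sketch (`build_invocation` is illustrative; flag spellings are taken from the table above, not from chorale's source):

```python
def build_invocation(backend: str, prompt: str, model: str):
    """Return (argv, stdin_payload) for one backend, per the table."""
    if backend == "claude":
        return ["claude", "--print", "-p", prompt, "--model", model], None
    if backend == "gemini":
        return ["gemini", "-p", prompt, "--model", model], None
    if backend == "codex":
        return ["codex", "exec", prompt, "--model", model], None
    if backend == "ollama":
        # ollama takes the prompt on stdin, not on the command line
        return ["ollama", "run", model], prompt
    raise ValueError(f"unknown backend: {backend}")

argv, stdin_payload = build_invocation("ollama", "any ideas?", "llama3")
```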

Config file

Optional. Default location: ~/.config/chorale/config.toml (override with --config PATH).

[defaults]
backend = "claude"
model   = "claude-sonnet-4-6"

# Override a built-in's default model:
[backends.gemini]
default_model = "gemini-2.5-pro"

# Define a fully custom backend (e.g. a local model server, a research CLI):
[backends.my-local]
command       = ["my-tool", "--prompt={prompt}", "--model={model}"]
prompt_via    = "argv"        # or "stdin" to pipe the prompt instead
default_model = "v1"
timeout       = 90.0

A custom backend can then be referenced as role@my-local in any role spec.
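
Assuming each argv element is treated as a str.format template, expanding the command list looks roughly like this (`render_command` is hypothetical, not chorale's actual code):

```python
def render_command(command: list[str], prompt: str, model: str) -> list[str]:
    # Each argv element is a str.format template; {prompt} and {model}
    # are filled in, all other text passes through unchanged.
    return [arg.format(prompt=prompt, model=model) for arg in command]

argv = render_command(
    ["my-tool", "--prompt={prompt}", "--model={model}"],
    prompt="Summarise the file",
    model="v1",
)
# → ['my-tool', '--prompt=Summarise the file', '--model=v1']
```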

Custom prompts

Pass --prompt-file PATH to override the built-in brainstorm prompt. The file is treated as a str.format template with two placeholders the harness fills in per turn:

  • {role} — the agent's role name (e.g. cook).
  • {file_content} — the current state of the shared file.

Anything an agent emits outside its own ## agent:{role} section is discarded by the splicer, so prompts only need to nudge the agent toward filling its own section sensibly.
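
A minimal custom template and its per-turn rendering might look like this (the template text is invented for illustration; it is not chorale's built-in prompt):

```python
# In practice this string would live in my-prompt.txt; inlined here.
template = (
    "You are the {role} agent. Read the file below and rewrite ONLY the "
    "body of your '## agent:{role}' section.\n\n{file_content}"
)

# The harness fills both placeholders each turn, roughly like:
prompt = template.format(role="cook", file_content="## user\nAny ideas?\n")
```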

Stopping

Ctrl-C on the running process stops all agents cleanly. While running, you can edit the shared file in any editor; agents will see your edits on their next poll. If a conflict happens (you and an agent both edit the same section), chorale idles all agents and waits for you to resolve it: edit the inline markers, then run cotype resolve FILE.
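
A pending conflict shows up as inline diff3-style markers in the file itself; the exact labels depend on cotype, but the shape is roughly:

```
<<<<<<< yours
the edit you saved
||||||| base
the text both edits started from
=======
the edit the agent saved
>>>>>>> agent
```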

How it works

┌─────────────────────────────────────────────────────────────┐
│  user (any editor) ─┐                                        │
│                     │ writes under ## user                   │
│                     ▼                                        │
│             ┌──── shared.md (cotype-managed) ────┐           │
│  agent_A ───┤                                    ├─── disk   │
│  agent_B ───┤   one section per actor            │           │
│  agent_C ───┘   diff3 reconciles concurrent saves│           │
│                 │                                │           │
│                 └────── chorale runtime ─────────┘           │
└─────────────────────────────────────────────────────────────┘

Each agent thread runs an independent loop:

  1. cotype status — idle if a conflict is pending (only the user can resolve).
  2. cotype open — capture a fresh base; skip if it hasn't changed since our last save.
  3. Invoke the role's backend CLI (e.g. claude --print -p PROMPT) — generate a candidate reply.
  4. Splice: parse the agent's output as Markdown sections, take only the body of ## agent:<role>, and splice it into the base bytes captured in step 2. By construction, no other section's bytes can change.
  5. cotype save — submit the spliced bytes; cotype decides direct / merged / noop / conflict.

The structural splice is the key idea: the agent can produce arbitrary content, but only its own section's bytes ever reach the file. Two agents editing two different sections produce edits in disjoint byte ranges, no matter how adjacent the section headers are.
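
A sketch of that splice, assuming sections are delimited by `## ` headers at line start (`splice` is illustrative, not chorale's implementation):

```python
import re

def splice(base: str, role: str, agent_output: str) -> str:
    """Replace only the body of `## agent:<role>` in `base` with the
    corresponding body from `agent_output`; everything else in the
    agent's output is discarded."""
    def find_section(text):
        return re.search(
            rf"(^## agent:{re.escape(role)}\n)(.*?)(?=^## |\Z)",
            text, re.M | re.S,
        )
    src, dst = find_section(agent_output), find_section(base)
    if not src or not dst:
        return base  # nothing usable to splice; leave the file untouched
    return base[:dst.start(2)] + src.group(2) + base[dst.end(2):]

base = (
    "## user\nWhat's the plan?\n\n"
    "## agent:cook\n(thinking)\n\n"
    "## agent:logistics\nkeep me\n"
)
out = splice(base, "cook", "Sure!\n\n## agent:cook\nUse buckwheat batter.\n")
```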

Tests

pip install pytest
pytest -q

Tests cover the splicer's contract (round-trip, role isolation, codefence stripping, no-change short-circuit) and the template generator. The runtime (subprocess wrappers, threading) is intentionally untested — it's almost entirely IO and best validated by running the demo.

Compared to

  • cotype — the byte-level safe-save CLI underneath. chorale is the agent harness; cotype is the merge engine.
  • headless-agents.sh — the original bash version, still in cotype's repo as a one-screen reference. chorale is the same idea with structure (config, tests, prompt extension point).

License

MIT. See LICENSE.

Download files


Source Distribution

chorale-0.2.1.tar.gz (40.5 kB)

Built Distribution


chorale-0.2.1-py3-none-any.whl (38.4 kB)

File details

Details for the file chorale-0.2.1.tar.gz.

File metadata

  • Download URL: chorale-0.2.1.tar.gz
  • Upload date:
  • Size: 40.5 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for chorale-0.2.1.tar.gz
Algorithm     Hash digest
SHA256        8c156736a71d7ba4622c05613afff1eae7adc068f78b7eb28d9f8ac6b46a62cc
MD5           f66e8206752a14cd9ba6a5474204159d
BLAKE2b-256   30508f53aa326775001f9762a91782a19c6e47ba23e8ba6dfa0035f533cf70de


Provenance

The following attestation bundles were made for chorale-0.2.1.tar.gz:

Publisher: publish.yml on yurug/chorale

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file chorale-0.2.1-py3-none-any.whl.

File metadata

  • Download URL: chorale-0.2.1-py3-none-any.whl
  • Upload date:
  • Size: 38.4 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for chorale-0.2.1-py3-none-any.whl
Algorithm     Hash digest
SHA256        76e148134c2cf70c1bfa2845463bea527cbf2e1347e30ed0e21d8720d9245927
MD5           6bc462fc90e62d44cb3a6b6446e5d166
BLAKE2b-256   f4478ac804ffbe84e8a9c9059c4ee0a1b4336bd511def8d5e54e3d28ced4146d


Provenance

The following attestation bundles were made for chorale-0.2.1-py3-none-any.whl:

Publisher: publish.yml on yurug/chorale

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
