
Adversarial journey framework — orchestrate multiple LLM peers through research, plan, and execute stages with file-trace peer review.

paircode

Multi-LLM peer review for your code, with file-traces on disk. One primary LLM (alpha) + any number of peer LLMs (Codex, Gemini, Ollama, …) running independent research, plans, and code, with structured cross-review rounds stored entirely as Markdown on disk.

Born from 31 iterations of dual-LLM silent-agreement hunting on a real ML project. See diary/001-step-a-architecture.md for the full design rationale.

Install

pipx install paircode       # or: pip install --user paircode
paircode install            # registers /paircode in every detected LLM CLI

After install, /paircode is available in all three:

| CLI         | File installed                     |
|-------------|------------------------------------|
| Claude Code | ~/.claude/commands/paircode.md     |
| Codex CLI   | ~/.codex/prompts/paircode.md       |
| Gemini CLI  | ~/.gemini/commands/paircode.toml   |

Open any of them and type /paircode. In Gemini, you may need to run /commands reload the first time.

As of v0.8.0, paircode delegates all CLI invocation to cliworker — one place to own the speed flags, MCP strip tricks, skip-cache, and subscription-first fallback logic. paircode adds the peer-review orchestration on top (file-traces, stages, gates, journey).

Use it — three entry points

1. From Claude Code (or any supported LLM) as a slash command

Inside a Claude Code session:

/paircode drive "build a KISS PHQ-9 depression risk engine"

Claude relays that to the CLI. paircode opens a focus, runs research → plan → execute with peer-reviewed rounds, writes everything to .paircode/ as Markdown.

2. From the shell directly

paircode init                                   # bootstrap .paircode/ in cwd
paircode handshake --write                      # detect CLIs + write peer roster
paircode drive "refactor the auth middleware"   # full loop
paircode status                                 # see where you are

3. Piece by piece

paircode focus "try GitHub Actions migration"
paircode stage research --rounds 2              # cold v1 + one review/revise round
paircode seal research                          # mark research FINAL
paircode stage plan --rounds 3
paircode seal plan
paircode stage execute
paircode seal execute

What ends up on disk

your-project/
  .paircode/
    JOURNEY.md                    # fleet log (auto-updated)
    peers.yaml                    # who's on the team
    peers/
      peer-a-codex/               # peer's profile (and code if full-fork mode)
    focus-01-<slug>/
      FOCUS.md                    # this focus's goal, roster override, gate config
      research/
        alpha-v1.md ... alpha-vN.md
        peer-a-codex-v1.md ...
        reviews/round-01-peer-a-codex-critiques-alpha.md
        alpha-FINAL.md            # sealed exit artifact
        peer-a-codex-FINAL.md
      plan/
        (same shape)
      execute/
        (same shape)
    focus-02-<slug>/
      ...

Every LLM's every thought lands as a Markdown file. That's how heterogeneous LLM tools communicate reliably across vendors, sessions, and days.
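The versioned-draft convention above (alpha-v1.md … alpha-vN.md) is plain files, so any tool can consume it. As a sketch, here is a hypothetical helper — not part of the paircode CLI — that resolves each peer's latest draft in a stage directory, assuming only the `{peer}-v{N}.md` naming shown in the tree:

```python
# Sketch: map each peer to its highest-numbered draft in a stage
# directory (e.g. .paircode/focus-01-<slug>/research/).
# "find_latest" is an illustrative helper, not paircode's API.
import re
from pathlib import Path

VERSION_RE = re.compile(r"^(?P<peer>.+)-v(?P<n>\d+)\.md$")

def find_latest(stage_dir: Path) -> dict[str, Path]:
    """Return {peer_name: path_to_highest_vN_draft}."""
    latest: dict[str, tuple[int, Path]] = {}
    for path in stage_dir.glob("*-v*.md"):
        m = VERSION_RE.match(path.name)
        if not m:
            continue
        peer, n = m.group("peer"), int(m.group("n"))
        if peer not in latest or n > latest[peer][0]:
            latest[peer] = (n, path)
    return {peer: path for peer, (_, path) in latest.items()}
```

Sealing a stage is then just copying each of these to `{peer}-FINAL.md`.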

Three peer modes

| Mode         | What the peer does                                         | When to use                                    |
|--------------|------------------------------------------------------------|------------------------------------------------|
| full-fork    | Writes its own cold codebase + markdown artifacts          | Silent-agreement hunting, safety-critical code |
| pair-code    | Contributes directly to alpha's codebase via patches + reviews | Feature work, regular dev                  |
| opinion-only | Reads alpha's work, writes reviews, never touches code     | Budget peers, quick sanity checks              |

Configured per peer in .paircode/peers.yaml.
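A minimal peers.yaml might look like the following sketch. The field names here are illustrative assumptions, not the tool's documented schema; only the file location and the three mode names come from this README:

```yaml
# Hypothetical .paircode/peers.yaml — field names are assumptions.
peers:
  - name: peer-a-codex
    cli: codex
    mode: full-fork        # independent cold codebase + artifacts
  - name: peer-b-gemini
    cli: gemini
    mode: opinion-only     # reviews only, never touches code
```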

Model compatibility

| CLI                   | Slash command                                    | Subprocess driver           | Status      |
|-----------------------|--------------------------------------------------|-----------------------------|-------------|
| Claude Code (claude)  | ✓ /paircode via ~/.claude/commands/paircode.md   | claude -p <prompt>          | stable      |
| Codex (codex)         | ✓ context rule via ~/.codex/rules/paircode.rules | codex exec <prompt>         | stable      |
| Gemini CLI (gemini)   | ✓ reference file at ~/.gemini/paircode.md        | gemini -p <prompt>          | stable      |
| Ollama (ollama)       | — (local models, no slash-cmd primitive)         | ollama run <model> <prompt> | stable      |
| Aider / others        | —                                                | best-effort, PRs welcome    | planned     |
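The subprocess drivers in the table share one shape: run the peer CLI with a prompt and capture stdout as a Markdown artifact. A minimal sketch of that pattern — the helper is hypothetical, not paircode's internal API, and the command templates are taken from the table:

```python
# Sketch of the subprocess-driver pattern: invoke a peer CLI and
# persist its reply as a Markdown file-trace. Illustrative only.
import subprocess
from pathlib import Path

def run_peer(argv: list[str], out_path: Path) -> None:
    """Run a peer CLI command and write its stdout to out_path."""
    result = subprocess.run(argv, capture_output=True, text=True, check=True)
    out_path.write_text(result.stdout)

# e.g. run_peer(["claude", "-p", prompt], stage_dir / "alpha-v1.md")
# or   run_peer(["codex", "exec", prompt], stage_dir / "peer-a-codex-v1.md")
```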

Commands

paircode --help             full command list
paircode install            register /paircode in all detected LLM CLIs
paircode uninstall          remove /paircode from LLM CLIs (idempotent)
paircode handshake          detect CLIs, propose peer roster
paircode handshake --write  save roster to .paircode/peers.yaml
paircode init               bootstrap .paircode/ in cwd
paircode status             summarize current state
paircode focus <name>       open a new focus
paircode focus              list existing focuses
paircode stage <name>       run one stage N rounds on active focus
paircode seal <stage>       seal stage — copy each peer's latest vN to {peer}-FINAL.md
paircode drive <topic>      full loop: research → plan → execute

Why this exists

See diary/001-step-a-architecture.md for the full backstory. The short version: running two LLMs adversarially surfaces silent-agreement bug classes that neither engine alone can catch, because cross-engine agreement is not the same as correctness when both share the same blind spot.

License

MIT. See LICENSE.
