
Bernstein

Declarative agent orchestration for engineering teams.

One YAML. Multiple coding agents. Ship while you sleep.

[Screenshot: Bernstein TUI — live task dashboard]

[Screenshot: Bernstein web dashboard — real-time task monitoring, cost tracking, agent status]


Homepage | Documentation | Getting Started | Known Limitations


If you're running one agent at a time, you're leaving performance on the table. Bernstein takes a goal, breaks it into tasks, assigns them to AI coding agents running in parallel, verifies the output, and commits the results. You come back to working code, passing tests, and a clean git history.

No framework to learn. No vendor lock-in. Works with Claude Code, Codex, Gemini CLI, Cursor, Aider, Amp, Roo Code, Goose, Qwen, and any CLI tool that accepts a prompt flag.

Think of it as what Kubernetes did for containers, but for AI coding agents. You declare a goal. The control plane decomposes it into tasks. Short-lived agents execute them in isolated git worktrees — like pods. A janitor verifies the output before anything lands.
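The worktree isolation model can be demonstrated with plain git, independent of Bernstein (the "backend" role and paths below are hypothetical):

```shell
# Plain-git sketch of the isolation model described above: each agent gets
# its own worktree on its own branch, so main never sees partial work.
set -e
root=$(mktemp -d)
git init -q -b main "$root/main"
git -C "$root/main" -c user.email=demo@example.com -c user.name=demo \
    commit -q --allow-empty -m "init"

# Give a hypothetical "backend" agent its own worktree and branch
git -C "$root/main" worktree add -q "$root/wt-backend" -b agent/backend

# The agent's work lands only on its branch...
echo "print('hi')" > "$root/wt-backend/api.py"
git -C "$root/wt-backend" add api.py
git -C "$root/wt-backend" -c user.email=demo@example.com -c user.name=demo \
    commit -q -m "backend: add api.py"

# ...while the main checkout stays clean until verification merges it
git -C "$root/main" status --porcelain   # prints nothing: main is untouched
```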

[Diagram: Architecture]
pip install bernstein                    # any platform
# or
pipx install bernstein                   # isolated install
# or
uv tool install bernstein                # fastest (Rust-based)
# or
brew tap chernistry/bernstein && brew install bernstein  # macOS / Linux
# or
sudo dnf copr enable alexchernysh/bernstein && sudo dnf install bernstein  # Fedora / RHEL
# or
npx bernstein-orchestrator               # npm wrapper (requires Python 3.12+)

# Run:
bernstein -g "Add JWT auth with refresh tokens, tests, and API docs"

1.78× faster than single-agent execution, verified on internal benchmarks. See benchmarks for methodology and reproduction steps.

What it is

Bernstein is a deterministic orchestrator for CLI coding agents. It schedules tasks in parallel across any installed agent — Claude Code, Codex, Cursor, Gemini, Aider, and more — with git worktree isolation, janitor-verified output, and file-based state you can inspect, back up, and recover from. No vendor lock-in. No framework to learn. Your agents, your models, your backlog.

5-minute setup

# 1. Install (pick one — full list in the install block above)
pipx install bernstein

# 2. Init your project (creates .sdd/ workspace + bernstein.yaml)
cd your-project
bernstein init

# 3. Run — pass a goal inline or let bernstein.yaml guide the run
bernstein -g "Add rate limiting and improve test coverage"

That's it. Your agents spawn, work in parallel, verify their output, and exit. Watch progress in the terminal dashboard.

Supported agents

Bernstein ships with adapters for 12 CLI agents. If you have any of these installed, Bernstein uses them — no API key plumbing required:

| Agent | Models | Install |
| --- | --- | --- |
| Aider | Any OpenAI/Anthropic-compatible model | pip install aider-chat |
| Amp | opus 4.6, gpt-5.4 | brew install amp |
| Claude Code | opus 4.6, sonnet 4.6, haiku 4.5 | npm install -g @anthropic-ai/claude-code |
| Codex CLI | gpt-5.4, o3, o4-mini | npm install -g @openai/codex |
| Cursor | sonnet 4.6, opus 4.6, gpt-5.4 | Cursor app (sign in via app) |
| Gemini CLI | gemini-3-pro, 3-flash | npm install -g @google/gemini-cli |
| Goose | Any provider | Install Goose CLI |
| Kilo | Configurable | npm install -g kilo |
| Kiro | Multi-provider | Install Kiro CLI |
| OpenCode | Multi-provider | Install OpenCode CLI |
| Qwen | qwen3-coder, qwen-max | npm install -g qwen-code |
| Roo Code | opus 4.6, sonnet 4.6, gpt-4o | VS Code extension (headless CLI) |

Prefer a different agent? Bring your own — the generic adapter accepts any CLI tool with a prompt-flag interface. Mix models in the same run: cheap free-tier agents for boilerplate, heavy models for architecture.
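As a rough illustration of what a bring-your-own-agent entry could look like — the key names here are hypothetical, not Bernstein's actual schema; consult the documentation for the real one:

```yaml
# Hypothetical shape only — not the real schema; see the Bernstein docs.
agents:
  - name: my-agent
    adapter: generic          # catch-all adapter for arbitrary CLI tools
    command: my-agent-cli     # executable to invoke
    prompt_flag: --prompt     # flag the tool accepts its prompt through
```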

[!TIP] Run bernstein --headless for CI pipelines -- no TUI, structured JSON output, non-zero exit on failure.

Shipped features

This list covers only capabilities that ship with v1.4.11. The full matrix is at FEATURE_MATRIX.md.

  • Deterministic scheduling — zero LLM tokens on coordination. The orchestrator is plain Python.
  • Parallel execution — spawn multiple agents across roles (backend, qa, docs, security) simultaneously.
  • Git worktree isolation — every agent works in its own branch. Your main branch stays clean.
  • Janitor verification — concrete signals (tests pass, files exist, no regressions) before anything lands.
  • Quality gates — lint, type-check, PII scan, and mutation testing run automatically after completion.
  • Plan files — multi-stage YAML with stages and steps, like Ansible playbooks (bernstein run plan.yaml).
  • Cost tracking — per-model spend, tokens, and duration (bernstein cost).
  • Live dashboards — terminal TUI (bernstein live) and browser UI (bernstein dashboard).
  • Self-evolution — analyze metrics, propose improvements, sandbox-test, and auto-apply what passes (--evolve).
  • CI autofix — parse failing CI logs, create fix tasks, route to the right agent (bernstein ci fix <url>).
  • Circuit breaker — halt agents that repeatedly violate purpose or crash.
  • Token growth monitor — detect runaway token consumption and intervene automatically.
  • Cross-model verification — route completed task diffs to a different model for review.
  • Audit trail — HMAC-chained tamper-evident logs with Merkle seal verification.
  • Pluggy plugin system — hook into any lifecycle event.
  • Multi-repo workspaces — orchestrate across multiple git repositories as one workspace.
  • Cluster mode — central server + remote worker nodes for distributed execution.
  • MCP server mode — run Bernstein as an MCP tool server for other agents.
  • 12 agent adapters — Claude, Codex, Cursor, Gemini, Aider, Amp, Roo Code, Kiro, Kilo, OpenCode, Qwen, Goose, plus a generic catch-all.
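To make the audit-trail idea concrete, here is a minimal sketch of HMAC chaining. It is illustrative only: Bernstein's real audit trail, key handling, and on-disk format differ.

```python
# Minimal sketch of an HMAC-chained, tamper-evident log (illustration only).
import hashlib
import hmac
import json

KEY = b"demo-audit-key"  # in practice, a per-install secret

def append(chain: list[dict], event: dict) -> None:
    """Chain each entry's MAC over the previous MAC plus the event payload."""
    prev = chain[-1]["mac"] if chain else "genesis"
    payload = json.dumps(event, sort_keys=True)
    mac = hmac.new(KEY, (prev + payload).encode(), hashlib.sha256).hexdigest()
    chain.append({"event": event, "mac": mac})

def verify(chain: list[dict]) -> bool:
    """Recompute every MAC; any edit to an earlier entry breaks the chain."""
    prev = "genesis"
    for entry in chain:
        payload = json.dumps(entry["event"], sort_keys=True)
        expected = hmac.new(KEY, (prev + payload).encode(),
                            hashlib.sha256).hexdigest()
        if not hmac.compare_digest(expected, entry["mac"]):
            return False
        prev = entry["mac"]
    return True

log: list[dict] = []
append(log, {"task": "T1", "status": "done"})
append(log, {"task": "T2", "status": "done"})
print(verify(log))                    # True
log[0]["event"]["status"] = "failed"  # tamper with an earlier entry
print(verify(log))                    # False
```

Because each MAC covers the previous one, rewriting any entry invalidates every entry after it, which is what makes the log tamper-evident.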

Install

All methods install the same bernstein CLI.

| Method | Command |
| --- | --- |
| pip | pip install bernstein |
| pipx | pipx install bernstein |
| uv | uv tool install bernstein |
| Homebrew | brew tap chernistry/bernstein && brew install bernstein |
| Fedora / RHEL | sudo dnf copr enable alexchernysh/bernstein && sudo dnf install bernstein |
| npm (thin wrapper) | npx bernstein-orchestrator or npm i -g bernstein-orchestrator |

The npm wrapper requires Python 3.12+ on the system — it delegates to pipx/uvx/python under the hood.

COPR targets: Fedora 41, 42 (x86_64, aarch64), EPEL 9, 10.

Editor extensions

| Editor | Install |
| --- | --- |
| VS Code | code --install-extension alex-chernysh.bernstein or search "Bernstein" in Extensions |
| Cursor | Search "Bernstein" in Extensions, or install from Open VSX |
| Cursor (skills) | 8 built-in skills in packages/cursor-plugin/ |

Monitoring and diagnostics

bernstein live          # interactive TUI dashboard (3 columns)
bernstein dashboard     # open web dashboard in browser
bernstein status        # task summary and agent health
bernstein ps            # running agent processes
bernstein cost          # spend breakdown by model and task
bernstein doctor        # pre-flight: adapters, API keys, ports
bernstein recap         # post-run: tasks, pass/fail, cost
bernstein retro         # detailed retrospective report
bernstein trace <ID>    # step-by-step agent decision trace
bernstein logs -f       # tail live agent output

Agents appear in Activity Monitor / ps as bernstein: <role> [<session>] — no more hunting for mystery Python processes.

Plan files

For multi-stage projects, define stages and steps in a YAML plan file:

bernstein run plan.yaml

The plan skips manager decomposition and goes straight to execution. See templates/plan.yaml for the format and examples/plans/flask-api.yaml for a working example.
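The rough shape is stages containing steps. The sketch below is illustrative only; templates/plan.yaml remains the authoritative format, and the role and goal values are invented:

```yaml
# Illustrative shape only — see templates/plan.yaml for the real format.
stages:
  - name: scaffold
    steps:
      - role: backend
        goal: "Create the app skeleton and a health endpoint"
  - name: verify
    steps:
      - role: qa
        goal: "Add tests for the health endpoint"
```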

Observability

Prometheus metrics at /metrics — wire up Grafana, set alerts, monitor cost. OTLP telemetry initialization supports distributed tracing.

Extensibility

Pluggy-based plugin system. Hook into any lifecycle event:

from bernstein.plugins import hookimpl

class SlackNotifier:
    @hookimpl
    def on_task_completed(self, task_id, role, result_summary):
        # `slack` is a placeholder for whatever notification client you wire up
        slack.post(f"#{role} finished {task_id}: {result_summary}")

GitHub App integration

Install a GitHub App on your repository to automatically convert GitHub events into Bernstein tasks. Issues become backlog items, PR review comments become fix tasks, and pushes trigger QA verification.

bernstein github setup       # print setup instructions
bernstein github test-webhook  # verify configuration

Agent catalogs

Hire specialist agents from Agency (100+ agents) or define your own:

# bernstein.yaml
catalogs:
  - name: agency
    type: agency
    enabled: true

The spawner matches the best agent for each role using keyword-based role inference and affinity scoring.
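Keyword-based role inference with affinity scoring can be sketched as follows. The keyword lists and weights here are invented for illustration; Bernstein's actual matcher is internal and may differ.

```python
# Invented illustration of keyword-based role inference with affinity scoring.
AFFINITY: dict[str, dict[str, int]] = {
    "backend": {"api": 2, "endpoint": 2, "auth": 1, "database": 1},
    "qa": {"test": 3, "coverage": 2, "regression": 1},
    "docs": {"readme": 2, "documentation": 2, "changelog": 1},
}

def infer_role(task: str) -> str:
    """Score each role by summing weights of keywords found in the task text."""
    words = task.lower().split()
    scores = {
        role: sum(
            weight
            for keyword, weight in keywords.items()
            if any(keyword in word for word in words)
        )
        for role, keywords in AFFINITY.items()
    }
    return max(scores, key=scores.get)  # ties fall back to dict order

print(infer_role("Add JWT auth endpoint to the API"))      # backend
print(infer_role("Improve test coverage for the parser"))  # qa
```

A real matcher would also weigh agent availability and model cost; this shows only the keyword-affinity step.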

Watch: terminal demo (GIF)

How it compares

| | Bernstein | CrewAI | AutoGen | LangGraph | Ruflo |
| --- | --- | --- | --- | --- | --- |
| Orchestrator type | Deterministic code | LLM-driven | LLM-driven | Graph + LLM | LLM-driven |
| Agent model | Any CLI agent | Python classes | Python agents | Nodes + edges | Claude only |
| Parallel execution | Native | Sequential | Async | Graph-based | Sequential |
| Git isolation | Worktrees | None | None | None | Branches |
| Verification | Janitor + quality gates | None built-in | None built-in | Conditional edges | Self-check |
| Cost tracking | Built-in | Manual | Manual | Manual | Built-in |
| State persistence | File-based (.sdd/) | In-memory | In-memory | Checkpointer | Cloud |
| Self-evolution | Built-in | No | No | No | Yes |
| Plan files | YAML stages + steps | Python code | Python code | Python code | No |
| Agent catalogs | Yes (Agency + custom) | No | No | No | No |

Full comparison pages offer detailed feature matrices, benchmark data, and "when to use X instead" guides for Conductor, Crystal, Stoneforge, GitHub Agent HQ, and single-agent workflows. See Comparisons.

Origin

Built during a 47-hour sprint: 12 AI agents on a single laptop, 737 tickets closed (15.7/hour), 826 commits. Full write-up. Every design decision here is a direct response to those findings.

Roadmap

Bernstein's roadmap is public. Near-term work focuses on adoption and the governance moat; longer-term work on enterprise standards and distribution.

Shipped

| Area | What | Status |
| --- | --- | --- |
| Governance | Lifecycle governance kernel — guarded state transitions, typed events | Done |
| Governance | Governed workflow mode — deterministic phases, hashable definitions | Done |
| Governance | Model routing policy — provider allow/deny lists | Done |
| Governance | Immutable HMAC-chained audit log — tamper-evident, daily rotation | Done |
| Governance | Execution WAL — hash-chained write-ahead log, crash recovery, determinism fingerprinting | Done |
| Adoption | CI autofix pipeline — bernstein ci fix <url> and bernstein ci watch | Done |
| Adoption | Comparative benchmark suite — orchestrated vs. single-agent proof | Done |
| Adoption | Agent run manifest — hashable workflow spec for SOC2 evidence | Done |
| Adoption | bernstein demo — zero-config first-run experience | Done |
| Adoption | bernstein doctor — pre-flight health check | Done |

Now (P1)

| Area | What | Target |
| --- | --- | --- |
| Enterprise | SSO/SAML/OIDC auth for multi-tenant deployments | H2 2026 |
| Governance | Time-based model policy constraints ("deny expensive providers during peak hours") | H2 2026 |
| Adoption | Verified SWE-Bench eval publication | In progress |

Next (P2)

| Area | What | Target |
| --- | --- | --- |
| Enterprise | Dynamic policy hot-reload without restart | 2026 |
| Adoption | JetBrains IDE extension | 2026 |
| Governance | Task-specific model constraints ("role=security must use opus-only") | 2026 |

Support Bernstein

Bernstein is free and open-source. If it saves you time, consider sponsoring.

All sponsorship proceeds fund development, infrastructure, and open-source sustainability.

Contributing

PRs welcome. See CONTRIBUTING.md for setup, testing, and code style. Open an issue for bugs and feature requests.

License

Apache License 2.0


Don't babysit agents. Set a goal, walk away, come back to working code.

What will your agents build first?


Download files

Source Distribution

bernstein-1.4.13.tar.gz (1.9 MB)

Built Distribution

bernstein-1.4.13-py3-none-any.whl (1.8 MB)

File details

Details for the file bernstein-1.4.13.tar.gz.

  • Size: 1.9 MB
  • Tags: Source
  • Uploaded using Trusted Publishing: Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

| Algorithm | Hash digest |
| --- | --- |
| SHA256 | f430cb9140eb7a5d1dc86ba8b7f6197a4c12eeb7ce4e9ab98861072e893a4c14 |
| MD5 | 130205643d1393b2b4db5e8aa5a78b5c |
| BLAKE2b-256 | 8001fba49279b363afe5831339b3bf7543aa1a281aac7bcf1edbb46cd11f051a |

Provenance

The following attestation bundles were made for bernstein-1.4.13.tar.gz:

Publisher: auto-release.yml on chernistry/bernstein

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file bernstein-1.4.13-py3-none-any.whl.

  • Size: 1.8 MB
  • Tags: Python 3
  • Uploaded using Trusted Publishing: Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

| Algorithm | Hash digest |
| --- | --- |
| SHA256 | 2751171f7eb758a3f8e4004e26caf24e25a0187ca016ce087d8218ae0f6da881 |
| MD5 | 7d607a7c25d7b0b4026c0cd87fa1c19b |
| BLAKE2b-256 | 17aae61567f45558b2b7230d757cfe238cf6c38163f26bbdfe8215140016b168 |

Provenance

The following attestation bundles were made for bernstein-1.4.13-py3-none-any.whl:

Publisher: auto-release.yml on chernistry/bernstein

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
