Declarative agent orchestration for engineering teams
Orchestrate any AI coding agent. Any model. One command.
Documentation · Getting Started · Glossary · Limitations
Wall of fame
"lol, good luck, keep vibecoding shit that you have no idea about xD" — PeaceFirePL, Reddit
Bernstein takes a goal, breaks it into tasks, assigns them to AI coding agents running in parallel, verifies the output, and merges the results. You come back to working code, passing tests, and a clean git history.
No framework to learn. No vendor lock-in. Agents are interchangeable workers — swap any agent, any model, any provider. The orchestrator itself is deterministic Python code. Zero LLM tokens on scheduling.
```shell
pip install bernstein
bernstein -g "Add JWT auth with refresh tokens, tests, and API docs"
```
Also available via pipx, uv tool install, brew, dnf copr, and npx bernstein-orchestrator. See install options.
Supported agents
Bernstein auto-discovers installed CLI agents. Mix them in the same run — cheap local models for boilerplate, heavy cloud models for architecture.
| Agent | Models | Install |
|---|---|---|
| Claude Code | opus 4.6, sonnet 4.6, haiku 4.5 | npm install -g @anthropic-ai/claude-code |
| Codex CLI | gpt-5.4, o3, o4-mini | npm install -g @openai/codex |
| Gemini CLI | gemini-3-pro, 3-flash | npm install -g @google/gemini-cli |
| Cursor | sonnet 4.6, opus 4.6, gpt-5.4 | Cursor app |
| Aider | Any OpenAI/Anthropic-compatible | pip install aider-chat |
| Ollama + Aider | Local models (offline) | brew install ollama |
| Amp, Cody, Continue.dev, Goose, Kilo, Kiro, OpenCode, Qwen, Roo Code, Tabby | Various | See docs |
| Generic | Any CLI with `--prompt` | Built-in |
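Mixing agents per role can be expressed in `bernstein.yaml`. The sketch below is hypothetical: the field names (`agents`, `agent`, `model`) are illustrative assumptions, not the documented schema.

```yaml
# Hypothetical bernstein.yaml sketch — keys are illustrative; see the docs
# for the real schema.
agents:
  boilerplate:
    agent: aider
    model: ollama/qwen2.5-coder   # cheap local model for routine edits
  architecture:
    agent: claude-code
    model: opus-4.6               # heavy cloud model for design-level tasks
```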
> [!TIP]
> Run `bernstein --headless` for CI pipelines — no TUI, structured JSON output, non-zero exit on failure.
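In CI this might look like the following hypothetical GitHub Actions step; the workflow wiring and the redirection of JSON output to a file are assumptions, not documented behavior.

```yaml
# Hypothetical CI step — only the --headless flag and its described
# behavior (JSON output, non-zero exit on failure) come from the docs.
- name: Run Bernstein headless
  run: |
    pip install bernstein
    bernstein --headless -g "Fix failing tests" > result.json
    # A non-zero exit code fails this step; result.json holds the structured output.
```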
Quick start
```shell
cd your-project
bernstein init                    # creates .sdd/ workspace + bernstein.yaml
bernstein -g "Add rate limiting"  # agents spawn, work in parallel, verify, exit
bernstein live                    # watch progress in the TUI dashboard
bernstein stop                    # graceful shutdown with drain
```
For multi-stage projects, define a YAML plan:
```shell
bernstein run plan.yaml            # skips LLM planning, goes straight to execution
bernstein run --dry-run plan.yaml  # preview tasks and estimated cost
```
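A plan might look like the sketch below. This is a hypothetical example: the keys (`tasks`, `role`, `files`, `done_when`, `depends_on`) are illustrative assumptions loosely mirroring the task fields described under "How it works", not the documented plan schema.

```yaml
# Hypothetical plan.yaml sketch — keys are illustrative, not the real schema.
goal: "Add rate limiting"
tasks:
  - id: middleware
    role: backend
    files: [src/ratelimit.py]          # owned files, one worktree per task
    done_when: "tests in tests/test_ratelimit.py pass"
  - id: docs
    role: writer
    depends_on: [middleware]           # runs only after middleware is verified
    files: [docs/rate-limiting.md]
```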
How it works
- Decompose — the manager breaks your goal into tasks with roles, owned files, and completion signals.
- Spawn — agents start in isolated git worktrees, one per task. Main branch stays clean.
- Verify — the janitor checks concrete signals: tests pass, files exist, lint clean, types correct.
- Merge — verified work lands in main. Failed tasks get retried or routed to a different model.
The orchestrator is a Python scheduler, not an LLM. Scheduling decisions are deterministic, auditable, and reproducible.
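The decompose/spawn/verify/merge loop above can be sketched as plain control flow. This is an illustrative simplification, not Bernstein's actual implementation; all names and the retry policy are assumptions. The point it demonstrates: given the same task graph and the same agent/verifier outcomes, the schedule is fully reproducible, with no LLM in the loop.

```python
# Illustrative deterministic scheduler sketch — not Bernstein's real code.
from dataclasses import dataclass

@dataclass
class Task:
    id: str
    depends_on: tuple = ()
    attempts: int = 0
    done: bool = False

def schedule(tasks, run_agent, verify, max_retries=2):
    """Run tasks whose dependencies are met; retry on failed verification."""
    pending = {t.id: t for t in tasks}
    merged = []
    while any(not t.done for t in pending.values()):
        ready = [t for t in pending.values()
                 if not t.done
                 and t.attempts <= max_retries
                 and all(pending[d].done for d in t.depends_on)]
        if not ready:
            raise RuntimeError("dependency cycle or retries exhausted")
        for t in sorted(ready, key=lambda t: t.id):  # deterministic ordering
            output = run_agent(t)       # agent works in its own worktree
            if verify(t, output):       # janitor checks concrete signals
                t.done = True
                merged.append(t.id)     # verified work lands in main
            else:
                t.attempts += 1         # failed task gets another try
    return merged
```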
Capabilities
Core orchestration — parallel execution, git worktree isolation, janitor verification, quality gates (lint + types + PII scan), cross-model code review, circuit breaker for misbehaving agents, token growth monitoring with auto-intervention.
Intelligence — contextual bandit router learns optimal model/effort pairs over time. Knowledge graph for codebase impact analysis. Semantic caching saves tokens on repeated patterns. Cost anomaly detection with Z-score flagging.
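The router idea can be illustrated with a much simpler epsilon-greedy bandit. This sketch is a conceptual stand-in: the real feature is described as a *contextual* bandit, while this version ignores context entirely and just tracks average reward per model. Every name here is an assumption.

```python
# Epsilon-greedy sketch of per-task model routing — a deliberate
# simplification of the contextual bandit the docs describe.
import random
from collections import defaultdict

class ModelRouter:
    def __init__(self, models, epsilon=0.1, seed=None):
        self.models = list(models)
        self.epsilon = epsilon
        self.rng = random.Random(seed)
        self.counts = defaultdict(int)
        self.rewards = defaultdict(float)

    def pick(self):
        if self.rng.random() < self.epsilon:
            return self.rng.choice(self.models)          # explore
        return max(self.models,                          # exploit best average
                   key=lambda m: self.rewards[m] / self.counts[m]
                   if self.counts[m] else 0.0)

    def update(self, model, reward):
        # A reward might be: 1.0 if the task verified, minus normalized cost.
        self.counts[model] += 1
        self.rewards[model] += reward
```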
Enterprise — HMAC-chained tamper-evident audit logs. Policy limits with fail-open defaults and multi-tenant isolation. PII output gating. OAuth 2.0 PKCE. SSO/SAML/OIDC auth. WAL crash recovery — no silent data loss.
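The "HMAC-chained tamper-evident" property means each log entry's MAC covers the previous entry's MAC, so editing any entry breaks every MAC after it. A minimal conceptual sketch, assuming a shared secret key; Bernstein's actual log format is not documented here:

```python
# Conceptual HMAC chain — illustrative only, not Bernstein's log format.
import hmac, hashlib, json

def append_entry(log, key: bytes, event: dict):
    prev = log[-1]["mac"] if log else "genesis"
    payload = json.dumps(event, sort_keys=True)
    mac = hmac.new(key, (prev + payload).encode(), hashlib.sha256).hexdigest()
    log.append({"event": event, "mac": mac})    # MAC chains to the prior entry

def verify_chain(log, key: bytes) -> bool:
    prev = "genesis"
    for entry in log:
        payload = json.dumps(entry["event"], sort_keys=True)
        expected = hmac.new(key, (prev + payload).encode(),
                            hashlib.sha256).hexdigest()
        if not hmac.compare_digest(expected, entry["mac"]):
            return False                        # tampering detected
        prev = entry["mac"]
    return True
```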
Observability — Prometheus /metrics, OTel exporter presets, Grafana dashboards. Per-model cost tracking (bernstein cost). Terminal TUI and web dashboard. Agent process visibility in ps.
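Scraping the `/metrics` endpoint could look like the fragment below. The port is a placeholder assumption; only the existence of a Prometheus `/metrics` endpoint comes from the feature list.

```yaml
# Hypothetical Prometheus scrape config — the target port is an assumption.
scrape_configs:
  - job_name: bernstein
    static_configs:
      - targets: ["localhost:9090"]   # default path /metrics
```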
Ecosystem — MCP server mode, A2A protocol support, GitHub App integration, pluggy-based plugin system, multi-repo workspaces, cluster mode for distributed execution, self-evolution via --evolve.
Full feature matrix: FEATURE_MATRIX.md
How it compares
| Feature | Bernstein | CrewAI | AutoGen | LangGraph |
|---|---|---|---|---|
| Orchestrator | Deterministic code | LLM-driven | LLM-driven | Graph + LLM |
| Works with | Any CLI agent (18+) | Python SDK classes | Python agents | LangChain nodes |
| Git isolation | Worktrees per agent | No | No | No |
| Verification | Janitor + quality gates | No | No | Conditional edges |
| Cost tracking | Built-in | No | No | No |
| State model | File-based (.sdd/) | In-memory | In-memory | Checkpointer |
| Self-evolution | Built-in | No | No | No |
| Declarative plans (YAML) | Yes | Partial | No | Yes |
| Model routing per task | Yes | No | No | Manual |
| MCP support | Yes | No | No | No |
| Agent-to-agent chat | No | Yes | Yes | No |
| Web UI | No | Yes | Yes | Partial |
| Cloud hosted option | No | Yes | No | Yes |
| Built-in RAG/retrieval | No | Yes | Yes | Yes |
Last verified: 2026-04-07. See full comparison pages for detailed feature matrices.
Monitoring
```shell
bernstein live           # TUI dashboard
bernstein dashboard      # web dashboard
bernstein status         # task summary
bernstein ps             # running agents
bernstein cost           # spend by model/task
bernstein doctor         # pre-flight checks
bernstein recap          # post-run summary
bernstein trace <ID>     # agent decision trace
bernstein explain <cmd>  # detailed help with examples
bernstein dry-run        # preview tasks without executing
bernstein aliases        # show command shortcuts
bernstein config-path    # show config file locations
bernstein init-wizard    # interactive project setup
```
Install
| Method | Command |
|---|---|
| pip | pip install bernstein |
| pipx | pipx install bernstein |
| uv | uv tool install bernstein |
| Homebrew | brew tap chernistry/bernstein && brew install bernstein |
| Fedora / RHEL | sudo dnf copr enable alexchernysh/bernstein && sudo dnf install bernstein |
| npm (wrapper) | npx bernstein-orchestrator |
Editor extensions: VS Marketplace · Open VSX
Contributing
PRs welcome. See CONTRIBUTING.md for setup and code style.
Support
If Bernstein saves you time: GitHub Sponsors · Open Collective
License
"To achieve great things, two things are needed: a plan and not quite enough time." — Leonard Bernstein