
Multi-agent orchestration framework — LLM-powered agents negotiate, refactor, and evolve your codebase


GraphBus

Distribute your AI reasoning. Multiple specialized agents negotiate, refactor, and evolve your codebase — separation of concerns at the reasoning level.


Website · Quickstart · Examples · CLI Reference · Architecture


What is GraphBus?

GraphBus is a Python framework with a radical idea: let your agents improve the code itself, not just run it.

Every class in a GraphBus project is a potential LLM agent. During a build cycle, agents wake up, read their own source, propose improvements, and negotiate consensus with other agents via a typed message bus. An arbiter resolves conflicts. The result is committed back to source.

At runtime, agents execute their negotiated logic. Each agent owns a focused slice of responsibility — no single context window trying to reason about everything at once.

Negotiate (agents improve the codebase) → Deploy (agents run their domain logic)

Why this matters

Most LLM coding tools (Claude Code, Cursor, Copilot) funnel everything through a single context window that grows without bound. GraphBus inverts this: specialized agents each reason about their own domain, negotiate at the boundaries, and commit improved code. The cognitive load is distributed, not bottlenecked.


Getting Started

GraphBus uses two keys for different purposes:

LLM provider key — powers LLM agent negotiation via LiteLLM. Set the key for your chosen provider:

export DEEPSEEK_API_KEY=...           # default model: deepseek/deepseek-reasoner
export ANTHROPIC_API_KEY=sk-ant-...   # for claude-* models
export OPENAI_API_KEY=sk-...          # for gpt-* models
export OPENROUTER_API_KEY=...         # access all models with one key

GraphBus API key (optional) — warehouses your negotiation history, contracts, and cross-session memory at api.graphbus.com. Without it, negotiation works fine but history isn't persisted.

export GRAPHBUS_API_KEY=gb_...        # sign up at graphbus.com
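The provider-key convention above can be sketched as a small lookup helper. This is illustrative only: the mapping and function below are assumptions for this sketch, not graphbus's actual key-resolution logic, and only the DeepSeek default model string comes from the documentation.

```python
import os
from typing import Optional

# Env vars from the section above. Only the DeepSeek default model is
# documented; the other entries are model-family placeholders, not
# graphbus's actual defaults.
PROVIDER_KEYS = [
    ("DEEPSEEK_API_KEY",   "deepseek/deepseek-reasoner"),  # documented default
    ("ANTHROPIC_API_KEY",  "claude-*"),
    ("OPENAI_API_KEY",     "gpt-*"),
    ("OPENROUTER_API_KEY", "openrouter/*"),
]

def detect_default_model() -> Optional[str]:
    """Return the model (family) for the first provider key set in the env."""
    for env_var, model in PROVIDER_KEYS:
        if os.environ.get(env_var):
            return model
    return None
```

With no keys set the helper returns None; with only ANTHROPIC_API_KEY exported it picks the claude family, mirroring how LiteLLM routes by provider key.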

Quickstart

# Install
pip install graphbus

# Create a new project
graphbus init my-project --template microservices
cd my-project

# Build (static, no LLM)
graphbus build agents/

# Run the built artifacts
graphbus run .graphbus/

# Enable LLM agents for a negotiation round
export DEEPSEEK_API_KEY=your_key_here  # or ANTHROPIC_API_KEY=sk-ant-...
graphbus build agents/ --enable-agents

That's it. Your agents will propose improvements, evaluate each other's proposals, and commit consensus changes. The run step uses zero AI budget.


Hello World

# agents/hello_service.py
from graphbus_core import GraphBusNode, schema_method, subscribe

class HelloService(GraphBusNode):
    SYSTEM_PROMPT = "I generate friendly greeting messages."

    @schema_method(
        input_schema={},
        output_schema={"message": str}
    )
    def generate_message(self):
        return {"message": "Hello from GraphBus!"}

    @subscribe("/Hello/MessageGenerated")
    def on_message(self, event):
        self.log(event.payload)

graphbus build agents/
# [BUILD] Scanning agents/hello_service.py
# [BUILD] Graph: 1 node, 0 edges
# [BUILD] Artifacts written to .graphbus/

graphbus run .graphbus/
# [RUNTIME] Loaded 1 agent
# [RUNTIME] HelloService → "Hello from GraphBus!"

Enable agents and watch them negotiate:

export DEEPSEEK_API_KEY=your_key_here  # or ANTHROPIC_API_KEY=sk-ant-...
graphbus build agents/ --enable-agents
# [AGENT] HelloService: "I propose adding input validation..."
# [AGENT] LoggerService: "I accept — improves contract safety"
# [ARBITER] Consensus reached. Committing changes.
# [BUILD] Artifacts written to .graphbus/ (2 files modified)

Architecture

GraphBus has two strictly separated modes:

┌─────────────────────────────────────────────────────────────────┐
│                           BUILD MODE                            │
│  ┌──────────┐  proposals  ┌─────────┐  evaluations ┌──────────┐ │
│  │  AgentA  │────────────▶│   BUS   │◀─────────────│  AgentB  │ │
│  │  (LLM)   │◀────────────│         │─────────────▶│  (LLM)   │ │
│  └──────────┘   commits   └────┬────┘              └──────────┘ │
│                                │                                │
│                          ┌─────▼─────┐                          │
│                          │  Arbiter  │  resolves conflicts      │
│                          └─────┬─────┘                          │
│                                │                                │
│                    ┌───────────▼──────────┐                     │
│                    │   Build Artifacts    │  (.graphbus/)       │
│                    │   graph.json         │                     │
│                    │   agents.json        │                     │
│                    │   topics.json        │                     │
│                    └──────────────────────┘                     │
└─────────────────────────────────────────────────────────────────┘
                                 │
                                 ▼  (deploy once)
┌─────────────────────────────────────────────────────────────────┐
│                          RUNTIME MODE                           │
│                                                                 │
│  ┌──────────┐   events    ┌─────────┐    events    ┌──────────┐ │
│  │  AgentA  │────────────▶│   BUS   │─────────────▶│  AgentB  │ │
│  │ (static) │             │  (pub/  │              │ (static) │ │
│  └──────────┘             │   sub)  │              └──────────┘ │
│                           └─────────┘                           │
│                                                                 │
│          ✅ No LLM calls   ✅ Deterministic   ✅ $0 AI cost        │
└─────────────────────────────────────────────────────────────────┘
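The runtime half of the diagram is plain topic-based pub/sub. A stdlib-only sketch of that message flow (the Bus and Event classes here are illustrative, not the actual graphbus_core runtime):

```python
from collections import defaultdict
from dataclasses import dataclass
from typing import Any, Callable, DefaultDict, Dict, List

@dataclass
class Event:
    """A message delivered on a topic path, e.g. /Order/Created."""
    topic: str
    payload: Dict[str, Any]

class Bus:
    """Minimal pub/sub backbone mirroring the RUNTIME MODE diagram."""
    def __init__(self) -> None:
        self._subs: DefaultDict[str, List[Callable[[Event], None]]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[Event], None]) -> None:
        self._subs[topic].append(handler)

    def publish(self, topic: str, payload: Dict[str, Any]) -> None:
        # Deliver synchronously to every handler registered for this topic.
        event = Event(topic, payload)
        for handler in self._subs[topic]:
            handler(event)

# Wire two "static" agents together, as in the diagram: A publishes, B reacts.
bus = Bus()
received = []
bus.subscribe("/Order/Created", lambda e: received.append(e.payload))
bus.publish("/Order/Created", {"order_id": 42})
```

No LLM is involved at this stage; routing is a dictionary lookup, which is what makes runtime mode deterministic and free of AI cost.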

Core Concepts

Concept          Description
GraphBusNode     Base class for all agents. Subclass it, add a SYSTEM_PROMPT, decorate methods.
@schema_method   Declares typed input/output schema for a method — forms the contract between agents.
@subscribe       Registers a handler for a topic on the message bus.
@depends_on      Declares a dependency edge between agents in the DAG.
Build Artifacts  JSON files emitted after a build: graph.json, agents.json, topics.json.
Arbiter          A special agent (mark with IS_ARBITER = True) that resolves conflicting proposals.
Message Bus      Typed pub/sub backbone. Topics are typed paths (e.g. /Order/Created).
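To make the decorator contracts concrete, here is a stdlib-only sketch of how @schema_method- and @subscribe-style decorators could attach metadata for a build scanner to read. The attribute names (_contract, _topic) are assumptions for this sketch, not graphbus_core internals:

```python
from typing import Any, Callable, Dict

def schema_method(input_schema: Dict[str, Any], output_schema: Dict[str, Any]) -> Callable:
    """Attach an input/output contract to a method (illustrative attribute name)."""
    def wrap(fn: Callable) -> Callable:
        fn._contract = {"input": input_schema, "output": output_schema}
        return fn
    return wrap

def subscribe(topic: str) -> Callable:
    """Mark a method as a handler for a typed topic path."""
    def wrap(fn: Callable) -> Callable:
        fn._topic = topic
        return fn
    return wrap

class OrderService:
    SYSTEM_PROMPT = "I manage orders."

    @schema_method(input_schema={"sku": str}, output_schema={"order_id": int})
    def create_order(self, sku: str) -> Dict[str, int]:
        return {"order_id": 1}

    @subscribe("/Order/Created")
    def on_created(self, event: Any) -> None:
        pass
```

A build-time scanner can then walk the class, collect every method carrying a contract or topic, and emit them into the graph/topic artifacts without ever executing the business logic.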

CLI Reference

GraphBus ships a full-featured CLI with 18 commands:

graphbus [OPTIONS] COMMAND [ARGS]...

Core Commands

Command                       Description
graphbus build <path>         Scan agents, build dependency graph, emit artifacts
graphbus run <artifacts>      Load artifacts and execute the runtime
graphbus inspect <artifacts>  Inspect build artifacts (graph, agents, topics)
graphbus validate <path>      Validate agent definitions without building
graphbus tui                  Launch interactive TUI (keyboard-driven UI)

Development Tools

Command                         Description
graphbus init <name>            Initialize a new project from template
graphbus generate agent <Name>  Generate agent boilerplate
graphbus profile <artifacts>    Profile runtime performance
graphbus dashboard              Launch web-based visualization dashboard
graphbus negotiate <path>       Run a standalone LLM negotiation round
graphbus inspect-negotiation    Browse negotiation history

Deployment Tools

Command                Description
graphbus docker build  Generate Dockerfile for your project
graphbus docker run    Build and run in Docker
graphbus k8s generate  Generate Kubernetes manifests
graphbus k8s deploy    Deploy to Kubernetes cluster
graphbus ci github     Generate GitHub Actions workflow
graphbus ci gitlab     Generate GitLab CI pipeline

Advanced

Command             Description
graphbus state      Manage agent state persistence
graphbus coherence  Run inter-agent coherence checks
graphbus contract   Validate schema contracts between agents
graphbus migrate    Migrate artifacts across schema versions
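A contract check between two agents reduces to comparing the producer's output schema against the consumer's input schema. A toy version of that edge check (this is an illustration of the idea, not the actual logic behind `graphbus contract`):

```python
from typing import Dict, Type

def contracts_compatible(producer_out: Dict[str, Type], consumer_in: Dict[str, Type]) -> bool:
    """Every field the consumer requires must be produced with the same type.
    Toy rule for illustration; the real `graphbus contract` command may differ."""
    return all(producer_out.get(field) is typ for field, typ in consumer_in.items())
```

Extra fields on the producer side are allowed; a missing field or a type mismatch fails the check.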

Examples

Three working examples are included in examples/:

1. hello_graphbus — The basics

cd examples/hello_graphbus
python build.py              # Build without agents
DEEPSEEK_API_KEY=your_key python build.py   # Build with LLM agents
python run.py                # Run the built artifacts

2. hello_world_mcp — MCP integration

GraphBus ships an MCP (Model Context Protocol) server so any MCP-compatible client can interact with a running GraphBus runtime as a tool.

cd examples/hello_world_mcp
graphbus build agents/
graphbus run .graphbus/ --mcp   # Exposes MCP endpoint

3. news_summarizer — Real-world pipeline

A multi-agent news summarization pipeline. One agent fetches, one summarizes, one formats. Each agent owns its domain. Agents negotiate a shared schema for the summary output and run their specialized logic at runtime.

cd examples/news_summarizer
graphbus build agents/
OPENAI_API_KEY=sk-... graphbus run .graphbus/
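The three-stage shape of this pipeline can be sketched as plain functions passing the negotiated summary schema along. The function bodies and field names (headline, count) are stand-ins invented for this sketch, not the example's actual agents:

```python
from typing import Dict, List

def fetch() -> List[str]:
    """Stand-in for the fetcher agent; returns raw article texts."""
    return ["GraphBus 0.5 released.", "Agents negotiate schemas."]

def summarize(articles: List[str]) -> Dict[str, object]:
    """Stand-in for the summarizer agent; emits the shared summary schema."""
    return {"headline": articles[0], "count": len(articles)}

def format_summary(summary: Dict[str, object]) -> str:
    """Stand-in for the formatter agent; consumes the same schema."""
    return f"{summary['headline']} ({summary['count']} articles)"

result = format_summary(summarize(fetch()))
```

The point of the negotiation step is that all three agents agree on the dict shape at build time, so the runtime hand-offs are plain typed messages.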

The Negotiation Protocol

When --enable-agents is set, each agent gets an LLM instance. Build Mode runs this cycle:

1. SCAN     → Discover all GraphBusNode subclasses in the target path
2. EXTRACT  → Parse methods, schemas, subscriptions, system prompts
3. BUILD    → Construct networkx DAG (topological sort for eval order)
4. ACTIVATE → Instantiate one LLM agent per node
5. PROPOSE  → Each agent reads its source and proposes improvements
6. EVALUATE → Agents evaluate each other's proposals (accept/reject + reasoning)
7. ARBITRATE → Arbiter resolves split decisions
8. COMMIT   → Accepted proposals are applied to source files
9. ARTIFACT → Build graph + agent metadata serialized to .graphbus/
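Step 3's evaluation order can be sketched with the stdlib graphlib module (GraphBus itself uses networkx, per the step list; the agent names below are invented for the sketch):

```python
from graphlib import TopologicalSorter

# Toy dependency graph: each agent maps to the agents it depends on,
# as @depends_on edges would declare.
graph = {
    "Formatter": {"Summarizer"},
    "Summarizer": {"Fetcher"},
    "Fetcher": set(),
}

# Topological sort: dependencies come first, so agents are evaluated bottom-up.
order = list(TopologicalSorter(graph).static_order())
```

With this ordering, each agent evaluates proposals only after the agents it depends on have already been processed.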

Proposals are structured messages:

from pydantic import BaseModel

class Proposal(BaseModel):
    agent_id: str
    target_file: str
    diff: str           # unified diff
    rationale: str      # LLM reasoning
    affects: list[str]  # other agents impacted
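Steps 6 through 8 then reduce to tallying each agent's accept/reject verdict on a proposal. A sketch with an assumed majority rule plus arbiter tiebreak; the real arbitration protocol is internal to GraphBus and may differ:

```python
from typing import Dict

def decide(votes: Dict[str, bool], arbiter: str) -> bool:
    """Accept a proposal on strict majority; on a tie, the arbiter's vote wins.
    Assumed rule for illustration, not the documented arbiter protocol."""
    yes = sum(votes.values())
    no = len(votes) - yes
    if yes != no:
        return yes > no
    return votes[arbiter]

# Two agents accept, the arbiter rejects: the majority carries the proposal.
accepted = decide({"LoggerService": True, "HelloService": True, "Arbiter": False},
                  arbiter="Arbiter")
```

Only proposals that pass this gate reach the COMMIT step and get applied to source files.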

Project Structure

graphbus-core/
├── graphbus_core/           # Core library
│   ├── node_base.py         # GraphBusNode base class
│   ├── decorators.py        # @schema_method, @subscribe, @depends_on
│   ├── config.py            # BuildConfig, RuntimeConfig
│   ├── build/               # Build pipeline (scanner, extractor, builder, writer)
│   ├── runtime/             # Runtime engine (loader, bus, router, executor)
│   ├── agents/              # LLM agent wrappers
│   └── model/               # Pydantic models (Message, Event, Proposal, ...)
├── graphbus_cli/            # CLI (click + rich)
│   ├── main.py              # Entry point
│   ├── commands/            # One file per command group
│   └── repl/                # Interactive REPL
├── graphbus_api/            # REST API server
├── graphbus-mcp-server/     # MCP protocol server
├── examples/
│   ├── hello_graphbus/      # Basic example
│   ├── hello_world_mcp/     # MCP integration
│   └── news_summarizer/     # Real-world pipeline
├── tests/                   # Full test suite
└── docs/
    └── core/                # Architecture docs

vs. LangGraph / CrewAI / AutoGen

Feature                               GraphBus         LangGraph       CrewAI          AutoGen
Agents rewrite source code            ✅ Core feature   ⚠️ Limited
Zero LLM cost at runtime              ✅ Always         ❌ Every call    ❌ Every call    ❌ Every call
Agent negotiation / consensus         ✅ Built-in       ⚠️ Partial       ⚠️ Partial
Graph-native DAG orchestration        ✅ networkx
Typed schema contracts per edge       ✅                ⚠️ Partial
Build / Runtime mode separation       ✅ Core design
Full deployment tooling (K8s/Docker)  ✅ CLI native
Interactive TUI                       ✅

The key difference: other frameworks run agents to perform tasks. GraphBus runs agents to improve the code that performs tasks. After a build cycle, the intelligence is baked into static artifacts — not perpetually consumed at runtime.


Installation

From source (current)

git clone https://github.com/graphbus/graphbus-core
cd graphbus-core
pip install -e .

From PyPI (coming soon)

pip install graphbus

Requirements

  • Python 3.9+
  • networkx >= 3.0
  • click >= 8.1.0
  • rich >= 13.0.0

Optional (for LLM agents):

  • litellm — all LLM providers (Anthropic, OpenAI, DeepSeek, OpenRouter, etc.)

Optional (for TUI):

  • textual >= 0.47.0

Testing

# Run all tests
pytest

# With coverage
pytest --cov=graphbus_core --cov-report=term-missing

# Run a specific test suite
pytest tests/test_runtime/
pytest tests/test_build/

Test coverage:

  • Build pipeline: 100% passing (scanner, extractor, graph builder, artifact writer)
  • Runtime engine: 100% passing (loader, message bus, event router, executor)
  • End-to-end: Hello World example builds and runs clean
  • CLI: All commands smoke-tested

Contributing

GraphBus is in alpha and we welcome contributors. See CONTRIBUTING.md for guidelines.

Quick start for contributors:

git clone https://github.com/graphbus/graphbus-core
cd graphbus-core
pip install -e ".[dev]"
pytest                    # Make sure everything passes

Areas where we especially want help:

  • More LLM backends — LiteLLM integration supports many providers; help us test them
  • More examples — real-world pipelines showing agent negotiation
  • Documentation — architecture docs, tutorials, protocol spec
  • Benchmarks — latency/cost comparisons vs. runtime LLM frameworks

Roadmap

See ROADMAP.md for the full roadmap with targets and status.

What's shipped (v0.1 alpha):

  • Build Mode (scanner → extractor → graph builder → artifact writer)
  • Runtime Mode (loader → message bus → event router → executor)
  • CLI with 18 commands
  • LLM negotiation engine (propose / evaluate / arbitrate / commit)
  • MCP server integration
  • Docker + Kubernetes deployment tooling
  • 800+ tests, CI with GitHub Actions

Coming next (v0.2):

  • graphbus dev — hot-reload mode during development
  • Message trace UI — replay message flows in a web UI
  • graphbus test — agent unit tests with full runtime wired in

Later:

  • PyPI release (pip install graphbus)
  • Multi-provider LLM support (via LiteLLM)
  • Ollama local LLM backend
  • Multi-process distributed runtime
  • TypeScript SDK
  • Protocol specification (for non-Python implementations)

Want to influence what ships next? Open a GitHub Discussion or 👍 the relevant issue.


License

MIT. See LICENSE.

