
Harbormaster

MCP server that routes Q&A across all your projects — locally or over SSH. Part of the FleetQ ecosystem.


What it does

You work across many projects, each with its own CLAUDE.md and Serena memories. Switching cwd loses context. Harbormaster lets one Claude Code session ask any project a question without changing directory — the project's subagent loads its own memory, answers, and returns a summary.

Optional SSH fan-out lets the same tools target remote VPS hosts. Optional FleetQ adapter makes Harbormaster a first-class citizen of the FleetQ Bridge ecosystem (Platform Tool, A2A Agent Cards, federated knowledge graph).

Tools

| Tool | Purpose | Cost |
| --- | --- | --- |
| `list_projects(host=None)` | Enumerate configured projects (local) or remote dir listing (SSH). | ~50 ms / ~1 s |
| `list_hosts()` | Configured `[hosts]` + `~/.ssh/config` Host aliases. | ~5 ms |
| `project_status(name, host=None)` | Git log, Serena memories, log tails. | ~200 ms / ~2 s |
| `ask_project(name, question, max_turns=5, host=None)` | Spawn `claude -p` in project cwd, return ≤ 800-word summary. | ~30 s / ~90 s |
| `delegate_task(name, task, deliverable, allow_writes=False, host=None)` | Read-only delegation; v1 fails closed for writes. | ~60 s / ~90 s |
| `fan_out_ask(question, project_filter=None, host_filter=None, max_concurrency=5, max_turns=3)` | Parallel multi-project Q&A; returns one section per target. | ~max_turns × claude_p_time × ⌈targets/max_concurrency⌉ |
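The fan_out_ask cost formula above is simple batch arithmetic: targets run in waves of max_concurrency, and each wave costs roughly max_turns × claude_p_time in the worst case. A sketch (the helper name and default timings are illustrative, not part of the API):

```python
import math

def fan_out_worst_case(targets: int, max_concurrency: int = 5,
                       claude_p_time: float = 30.0, max_turns: int = 3) -> float:
    """Hypothetical worst-case wall time for a fan_out_ask call, in seconds.

    Targets are processed in batches of max_concurrency; each batch takes
    roughly max_turns * claude_p_time when every turn runs to the limit.
    """
    batches = math.ceil(targets / max_concurrency)
    return max_turns * claude_p_time * batches

# 12 targets at concurrency 5 -> 3 batches of ~90 s each
print(fan_out_worst_case(12))  # 270.0
```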

More tools (recall_qa, …) land in v1.1–1.2. See docs/architecture-harbormaster.md.

Install

pipx install harbormaster-mcp
# or run without install:
uvx harbormaster-mcp

Register in Claude Code:

claude mcp add --scope user harbormaster harbormaster-mcp

Or in Claude Desktop (~/Library/Application Support/Claude/claude_desktop_config.json):

{
  "mcpServers": {
    "harbormaster": {
      "command": "/opt/homebrew/bin/harbormaster-mcp",
      "env": {}
    }
  }
}

Live UI (optional)

Install with the [ui] extra and run the dashboard alongside (or instead of) the MCP server:

pipx install 'harbormaster-mcp[ui]'
harbormaster-ui --port 7531
# open http://127.0.0.1:7531/

v1.0.0a4 ships:

  • Dashboard at / — project grid with framework / git / Serena / CLAUDE.md badges (HTMX + Alpine + Tailwind via CDN; no build step).
  • GET /api/projects — JSON list of every project Harbormaster discovers (use it to script your own dashboards).
  • GET /api/health — returns {"status":"ok","version":"..."} for liveness probes.

The UI is a separate process from the MCP server. Run both — they read the same TOML config so projects discovered by one are visible to the other. SSE feed of live MCP queries lands in v1.0.0a5.
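A liveness probe only needs to check the documented /api/health shape. A minimal sketch of the consumer side (the helper is illustrative; only the JSON shape comes from the endpoint description above):

```python
import json

def is_healthy(body: str) -> bool:
    """Check a GET /api/health response body for liveness (illustrative).

    Expects the documented shape {"status": "ok", "version": "..."};
    anything unparseable or with a different status counts as unhealthy.
    """
    try:
        payload = json.loads(body)
    except json.JSONDecodeError:
        return False
    return payload.get("status") == "ok"

print(is_healthy('{"status":"ok","version":"1.0.0a4"}'))  # True
```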

HTTP / SSE transport

For remote MCP clients or running outside the desktop client, Harbormaster can speak SSE / streamable-http instead of stdio. A bearer token is required — there is no auth-disabled HTTP mode.

export HARBORMASTER_MCP_TOKEN=$(python -c 'import secrets; print(secrets.token_urlsafe(32))')
harbormaster-mcp --transport sse --host 127.0.0.1 --port 7532
# or the new MCP spec transport:
harbormaster-mcp --transport streamable-http --port 7532

Clients send the token as Authorization: Bearer <token>. Missing or wrong tokens return 401.

Override the env-var name with --auth-token-env MY_VAR if you keep secrets under a different name. Use --host 0.0.0.0 only if you understand the implications — the bearer token is the only thing between the open port and your projects.

Run harbormaster-mcp --help for the full flag set.

Configure

Zero-config by default — Harbormaster discovers projects under ~/htdocs/* if it exists. For any other layout, drop a TOML file at ~/.config/harbormaster/config.toml:

[projects]
glob = ["~/code/*", "~/work/*"]
exclude = ["**/node_modules/**", "**/vendor/**"]

[hosts.friday]
ssh_host = "katsarov-server.local"
remote_htdocs = "~/htdocs"

[hosts.hetzner-1]
ssh_host = "hetzner-1.example.com"
remote_htdocs = "/var/www"

A per-project override at ./.harbormaster.toml in your cwd takes precedence over the user-level config.

Full schema and all options: docs/architecture-harbormaster.md §3.

Remote hosts

Every project-targeting tool accepts an optional host parameter. With host set, Harbormaster runs the equivalent command on that SSH host:

> ask_project(name="pricex", question="quick health check?", host="friday")
[ssh friday bash -lc 'cd ~/htdocs/pricex && claude -p ...']

Pre-flight on each remote host:

  1. Install Claude Code: npm i -g @anthropic-ai/claude-code.
  2. Authenticate once: claude (this is a separate Anthropic seat per host).
  3. Ensure project paths exist with their CLAUDE.md / .serena/ in place.
  4. Confirm passwordless SSH from your machine (BatchMode=yes is enforced).
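The bracketed ssh line above can be sketched as argv construction. This is an illustrative reconstruction (the helper name is hypothetical); per the architecture notes, every user-supplied string passes through shlex.quote before it reaches a remote shell, and BatchMode=yes enforces passwordless auth:

```python
import shlex

def build_remote_command(project: str, question: str, host: str,
                         remote_htdocs: str = "~/htdocs") -> list[str]:
    """Build an ssh argv like the one Harbormaster runs for ask_project
    against a remote host (illustrative sketch)."""
    inner = (f"cd {remote_htdocs}/{shlex.quote(project)} && "
             f"claude -p {shlex.quote(question)}")
    return ["ssh", "-T", "-q", "-o", "BatchMode=yes", host, "bash", "-lc", inner]
```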

Streaming

POST /mcp/{server} accepts Accept: text/event-stream for incremental output. Long-running tools (ask_project, delegate_task, fan_out_ask, all 30–90s) emit heartbeat events on the wire while they run, then a final result event with the same MCP envelope JSON-mode would return. ask_project against a local project additionally emits per-token chunk events as claude -p --output-format stream-json produces them.

Direct curl example (bypasses FleetQ — for testing or a custom consumer):

curl -N -X POST http://127.0.0.1:7531/mcp/harbormaster \
  -H 'Accept: text/event-stream' \
  -H 'Content-Type: application/json' \
  -d '{"method":"tools/call","params":{"name":"ask_project","arguments":{"name":"alpha","question":"summarize"}}}'

Through the FleetQ Bridge, set stream: true in the request body — the Bridge forwards text/event-stream bytes verbatim with X-Accel-Buffering: no so reverse proxies don't buffer.

JSON mode (no Accept: text/event-stream, no stream flag) is unchanged — fully backward-compatible.
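A consumer of the stream above only needs a minimal SSE parser: accumulate event/data fields and dispatch on the blank line. A sketch (illustrative; real streams arrive incrementally over the socket rather than as one string):

```python
def parse_sse(stream: str):
    """Yield (event, data) pairs from a text/event-stream body (illustrative).

    Handles the subset used here: event:, data:, and blank-line dispatch.
    """
    event, data = "message", []
    for line in stream.splitlines():
        if line.startswith("event:"):
            event = line[len("event:"):].strip()
        elif line.startswith("data:"):
            data.append(line[len("data:"):].strip())
        elif line == "":  # blank line dispatches the pending event
            if data:
                yield event, "\n".join(data)
            event, data = "message", []
```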

v1 limits

  • Read-only delegation (allow_writes=True returns an error).
  • 60 s local / 90 s remote subprocess timeout.
  • 800-word output cap (full output dumped to /tmp/harbormaster-*.md on truncation).
  • Remote list_projects returns a flat list of directory names (rich metadata is local-only — gathering it remotely would mean N round-trips).
  • Per-token chunk events are local-only — ask_project over SSH still falls back to heartbeat + final result (remote stdout demux is a separate refactor).
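The word-cap behavior above can be sketched in a few lines: count words, truncate at the cap, and spill the full text to a /tmp/harbormaster-*.md file. The helper is illustrative, not the shipped implementation:

```python
import os
import tempfile

def cap_output(text: str, max_words: int = 800) -> str:
    """Apply an 800-word cap like the one described above (illustrative).

    On truncation, the full output is written to /tmp/harbormaster-*.md
    and a pointer to that file is appended to the summary.
    """
    words = text.split()
    if len(words) <= max_words:
        return text
    fd, path = tempfile.mkstemp(prefix="harbormaster-", suffix=".md")
    with os.fdopen(fd, "w") as fh:
        fh.write(text)
    return " ".join(words[:max_words]) + f"\n[truncated; full output: {path}]"
```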

Status

v1.0.0a13 — SSH streaming + first v1.1 deliverable, shipped 2026-05-09. ask_project over SSH now emits chunk events for every assistant text delta, matching local behavior (ssh -T -q keeps banners off stdout; non-JSON noise is filtered silently). Validation is tighter: unknown-project errors now return a deterministic 400 instead of a 400-or-502 that depended on iteration timing. On the FleetQ side, Harbormaster is seeded as a popular MCP stdio tool, so fresh installs surface it under /tools (disabled by default, like all seeded tools). The 6-week roadmap to general availability:

| Phase | Weeks | Focus |
| --- | --- | --- |
| v1.0 | 1–2 | Local + SSH + Live UI scaffold + PyPI alpha |
| v1.1 | 3–4 | FleetQ Bridge / Platform Tool / A2A integration |
| v1.2 | 5–6 | Q&A history, federated KG, auto project graph |

See docs/design-harbormaster.md for the full design.

Lineage

Harbormaster v1.0 grew out of project-router-mcp v0.1 (2026-05-08). v0.1 git history is preserved on this repository — the v0.1 single-file server lived at src/server.py and remains in commits prior to the v1.0 scaffolding refactor.

Architecture

Single Python process hosting an MCP server (stdio + HTTP/SSE), an embedded Live UI, and an optional FleetQ adapter. Pluggable backend per host (default: claude -p). All shell-bound strings pass through shlex.quote.

Detailed component diagrams, transport choices, and integration contract: docs/architecture-harbormaster.md.

FleetQ Bridge integration (optional)

Install with the [fleetq] extra and Harbormaster can register itself as a Bridge daemon in your FleetQ deployment, advertising its 6 MCP tools to the platform:

pipx install 'harbormaster-mcp[fleetq]'

In your config TOML:

[fleetq]
enabled = true
register_as_bridge = true
base_url = "https://app.fleetq.net"   # or your self-hosted FleetQ URL
api_token_env = "FLEETQ_API_TOKEN"    # env var holding the Sanctum token
heartbeat_interval = 30               # seconds between heartbeats

Then export your Sanctum token (must have a team:<uuid> ability) and run the MCP server:

export FLEETQ_API_TOKEN=...
harbormaster-mcp

Harbormaster shows up in your FleetQ Connections UI as harbormaster on <hostname>. v1.0.0a6 ships register + heartbeat + disconnect; the reverse-WebSocket relay channel for incoming MCP tool calls lands in v1.0.0a7+.

Discovered contract reference: docs/fleetq-bridge-contract.md.

Releasing

PyPI publishing is automated via Trusted Publishing (OIDC) — no API tokens in the repo. Tag-pushes to v* trigger .github/workflows/publish.yml. Setup steps and the release checklist live in docs/publishing.md.

License

MIT — see LICENSE.

Project details

Download files

| File | Size | Type |
| --- | --- | --- |
| harbormaster_mcp-1.0.0a13.tar.gz | 252.4 kB | Source |
| harbormaster_mcp-1.0.0a13-py3-none-any.whl | 55.7 kB | Python 3 wheel |

The sdist was uploaded via Trusted Publishing (twine/6.1.0 on CPython/3.13.12). Both files carry attestation bundles from publish.yml on FleetQ/harbormaster; attestation values reflect the state when the release was signed and may no longer be current.

File hashes

harbormaster_mcp-1.0.0a13.tar.gz:

| Algorithm | Hash digest |
| --- | --- |
| SHA256 | 4d3183561357d6983abeef74f0cdf23c39943a02a0956d63e48dae42472534da |
| MD5 | 640e41f9a746efc5ee10b567d189d15a |
| BLAKE2b-256 | 12f50c4ba59ba0bf7b01ece9c55228892946f16a6d91b46af318d1b2c72e3904 |

harbormaster_mcp-1.0.0a13-py3-none-any.whl:

| Algorithm | Hash digest |
| --- | --- |
| SHA256 | 3863f3751bd86945cf8154d2bcd55ea2d7497ce073fcf40a36e2178d0b006c4a |
| MD5 | 5692a1bfbcf911303378d2a20612b2b2 |
| BLAKE2b-256 | 06c1b9426770da8a21bedee15207e06d3d3158cab1411a7c5aabb22d520bd094 |
