
Harbormaster

MCP server that routes Q&A across all your projects — locally or over SSH. Part of the FleetQ ecosystem.


What it does

You work across many projects, each with its own CLAUDE.md and Serena memories. Switching cwd loses context. Harbormaster lets one Claude Code session ask any project a question without changing directory — the project's subagent loads its own memory, answers, and returns a summary.

Optional SSH fan-out lets the same tools target remote VPS hosts. Optional FleetQ adapter makes Harbormaster a first-class citizen of the FleetQ Bridge ecosystem (Platform Tool, A2A Agent Cards, federated knowledge graph).

Tools

| Tool | Purpose | Cost (local / SSH) |
| --- | --- | --- |
| list_projects(host=None) | Enumerate configured projects (local) or remote dir listing (SSH). | ~50 ms / ~1 s |
| list_hosts() | Configured [hosts] + ~/.ssh/config Host aliases. | ~5 ms |
| project_status(name, host=None) | Git log, Serena memories, log tails. | ~200 ms / ~2 s |
| ask_project(name, question, max_turns=5, host=None) | Spawn claude -p in project cwd, return ≤ 800-word summary. | ~30 s / ~90 s |
| delegate_task(name, task, deliverable, allow_writes=False, host=None) | Read-only delegation; v1 fails closed for writes. | ~60 s / ~90 s |
| fan_out_ask(question, project_filter=None, host_filter=None, max_concurrency=5, max_turns=3) | Parallel multi-project Q&A. Returns one section per target. | ~max_turns × claude_p_time × ⌈targets/max_concurrency⌉ |
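The fan-out cost row reads as a simple formula: targets run in batches of max_concurrency, and each batch costs roughly one claude -p call. A minimal sketch of that estimate (the function name and the ~30 s per-call figure are illustrative assumptions, not part of the API):

```python
import math

def fan_out_estimate(targets: int, max_concurrency: int = 5,
                     max_turns: int = 3, claude_p_seconds: float = 30.0) -> float:
    """Rough wall-clock bound for fan_out_ask: targets run in batches of
    max_concurrency; each batch costs ~max_turns * claude_p_seconds."""
    if targets <= 0:
        return 0.0
    batches = math.ceil(targets / max_concurrency)
    return max_turns * claude_p_seconds * batches
```

For example, 12 targets at the default concurrency of 5 means 3 batches, so roughly 3 × 3 × 30 s of wall-clock time.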

More tools (recall_qa, …) land in v1.1–1.2. See docs/architecture-harbormaster.md.

Install

pipx install harbormaster-mcp
# or run without install:
uvx harbormaster-mcp

Register in Claude Code:

claude mcp add --scope user harbormaster harbormaster-mcp

Or in Claude Desktop (~/Library/Application Support/Claude/claude_desktop_config.json):

{
  "mcpServers": {
    "harbormaster": {
      "command": "/opt/homebrew/bin/harbormaster-mcp",
      "env": {}
    }
  }
}
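If you script the Desktop config instead of editing it by hand, merging the entry avoids clobbering other MCP servers already registered there. A hedged sketch (the helper name is made up; only the mcpServers shape comes from the JSON above):

```python
import json

def add_harbormaster(config_text: str,
                     command: str = "/opt/homebrew/bin/harbormaster-mcp") -> str:
    """Merge a harbormaster entry into claude_desktop_config.json
    without overwriting any other configured MCP servers."""
    config = json.loads(config_text) if config_text.strip() else {}
    servers = config.setdefault("mcpServers", {})
    servers["harbormaster"] = {"command": command, "env": {}}
    return json.dumps(config, indent=2)

# Existing servers survive the merge:
updated = json.loads(add_harbormaster('{"mcpServers": {"other": {"command": "x"}}}'))
```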

Live UI (optional)

Install with the [ui] extra and run the dashboard alongside (or instead of) the MCP server:

pipx install 'harbormaster-mcp[ui]'
harbormaster-ui --port 7531
# open http://127.0.0.1:7531/

v1.0.0a4 ships:

  • Dashboard at / — project grid with framework / git / Serena / CLAUDE.md badges (HTMX + Alpine + Tailwind via CDN; no build step).
  • GET /api/projects — JSON list of every project Harbormaster discovers (use this to script your own dashboards).
  • GET /api/health — returns {"status":"ok","version":"..."} for liveness probes.

The UI is a separate process from the MCP server. Run both — they read the same TOML config so projects discovered by one are visible to the other. SSE feed of live MCP queries lands in v1.0.0a5.
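A liveness probe against /api/health only needs to parse the response body. A minimal sketch (the helper name is an assumption; in a real probe you would first fetch http://127.0.0.1:7531/api/health, e.g. with urllib.request, and pass the body in):

```python
import json

def is_healthy(body: str) -> bool:
    """Check a GET /api/health response body: the endpoint returns
    {"status": "ok", "version": "..."} when the UI process is up."""
    try:
        payload = json.loads(body)
    except json.JSONDecodeError:
        return False
    return payload.get("status") == "ok"
```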

HTTP / SSE transport

For remote MCP clients or running outside the desktop client, Harbormaster can speak SSE / streamable-http instead of stdio. A bearer token is required — there is no auth-disabled HTTP mode.

export HARBORMASTER_MCP_TOKEN=$(python -c 'import secrets; print(secrets.token_urlsafe(32))')
harbormaster-mcp --transport sse --host 127.0.0.1 --port 7532
# or the new MCP spec transport:
harbormaster-mcp --transport streamable-http --port 7532

Clients send the token as Authorization: Bearer <token>. Missing or wrong tokens return 401.

Override the env-var name with --auth-token-env MY_VAR if you keep secrets under a different name. Use --host 0.0.0.0 only if you understand the implications — the bearer token is the only thing between the open port and your projects.
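From a Python client, attaching the bearer token is one header. A sketch that only builds the authenticated request (it does not open the SSE stream; the helper name is an assumption, the env var, URL, and 401 behaviour come from above):

```python
import os
import secrets
import urllib.request

# Generate a token the same way the shell snippet above does,
# if one is not already exported.
os.environ.setdefault("HARBORMASTER_MCP_TOKEN", secrets.token_urlsafe(32))

def build_mcp_request(url: str,
                      token_env: str = "HARBORMASTER_MCP_TOKEN") -> urllib.request.Request:
    """Attach the bearer token; the server answers 401 without it."""
    token = os.environ[token_env]
    return urllib.request.Request(url, headers={"Authorization": f"Bearer {token}"})

req = build_mcp_request("http://127.0.0.1:7532/sse")
```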

Run harbormaster-mcp --help for the full flag set.

Configure

Zero-config by default — Harbormaster discovers projects under ~/htdocs/* if it exists. For any other layout, drop a TOML file at ~/.config/harbormaster/config.toml:

[projects]
glob = ["~/code/*", "~/work/*"]
exclude = ["**/node_modules/**", "**/vendor/**"]

[hosts.friday]
ssh_host = "katsarov-server.local"
remote_htdocs = "~/htdocs"

[hosts.hetzner-1]
ssh_host = "hetzner-1.example.com"
remote_htdocs = "/var/www"

A per-project override at ./.harbormaster.toml in your cwd takes precedence over the user-level config.

Full schema and all options: docs/architecture-harbormaster.md §3.

Remote hosts

Every project-targeting tool accepts an optional host parameter. With host set, Harbormaster runs the equivalent command on that SSH host:

> ask_project(name="pricex", question="quick health check?", host="friday")
[ssh friday bash -lc 'cd ~/htdocs/pricex && claude -p ...']
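Such a command can be assembled safely with shlex.quote, which is what the architecture notes say all shell-bound strings pass through. A sketch, not Harbormaster's exact internals (the function name is made up; note it takes an absolute project path, since quoting a ~ path would defeat tilde expansion):

```python
import shlex

def remote_ask_command(ssh_host: str, project_dir: str, question: str) -> list[str]:
    """Build the ssh argv shown above; the cd target and the question
    both pass through shlex.quote before reaching the remote shell."""
    inner = f"cd {shlex.quote(project_dir)} && claude -p {shlex.quote(question)}"
    return ["ssh", ssh_host, "bash", "-lc", inner]

cmd = remote_ask_command("friday", "/home/deploy/htdocs/pricex", "quick health check?")
```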

Pre-flight on each remote host:

  1. Install Claude Code: npm i -g @anthropic-ai/claude-code.
  2. Authenticate once: claude (this is a separate Anthropic seat per host).
  3. Ensure project paths exist with their CLAUDE.md / .serena/ in place.
  4. Confirm passwordless SSH from your machine (BatchMode=yes is enforced).
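Steps 2 and 4 can be checked in one shot: run a no-op command over ssh with BatchMode=yes, which fails fast instead of hanging on a password prompt. A sketch that only builds the argv (the helper name is an assumption; run it with subprocess.run against a reachable host and check the return code):

```python
def preflight_cmd(host: str) -> list[str]:
    """One-shot remote check: passwordless auth (BatchMode=yes makes
    ssh fail instead of prompting) plus a claude binary on PATH."""
    return ["ssh", "-o", "BatchMode=yes", host, "command -v claude"]

# e.g.: ok = subprocess.run(preflight_cmd("friday"), capture_output=True).returncode == 0
cmd = preflight_cmd("friday")
```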

v1 limits

  • Read-only delegation (allow_writes=True returns an error).
  • 60 s local / 90 s remote subprocess timeout.
  • 800-word output cap (full output dumped to /tmp/harbormaster-*.md on truncation).
  • Remote list_projects returns a flat list of directory names (rich metadata is local-only — gathering it remotely would mean N round-trips).
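The 800-word cap with spill-to-file behaves roughly like this sketch (an illustration of the limit, not the shipped implementation; the spill lands in the system temp dir, which is /tmp on typical Unix setups):

```python
import tempfile

def cap_output(text: str, limit_words: int = 800) -> str:
    """Return at most limit_words words; on truncation, spill the full
    text to a harbormaster-*.md file and point the caller at it."""
    words = text.split()
    if len(words) <= limit_words:
        return text
    with tempfile.NamedTemporaryFile(
        mode="w", prefix="harbormaster-", suffix=".md", delete=False
    ) as fh:
        fh.write(text)
        spill_path = fh.name
    return " ".join(words[:limit_words]) + f"\n[truncated; full output: {spill_path}]"

short = cap_output("hello world")
capped = cap_output("word " * 2000)
```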

Status

v1.0.0a10 — Polish + foundations sprint, shipped 2026-05-09:

  • SSE streaming on /mcp/{server} — heartbeats keep long ask_project / delegate_task / fan_out_ask calls alive through reverse-proxy timeouts.
  • update_endpoints config-watch loop — manifest drift is pushed automatically without a process restart.
  • Gated live FleetQ Bridge CI smoke test.
  • 4xx pass-through on the agent-fleet side, so daemon errors are no longer masked as generic 502s.

v1.0.0a9 was the first release published on PyPI; v1.0.0a10 builds on it. The 6-week roadmap to general availability:

| Phase | Weeks | Focus |
| --- | --- | --- |
| v1.0 | 1–2 | Local + SSH + Live UI scaffold + PyPI alpha |
| v1.1 | 3–4 | FleetQ Bridge / Platform Tool / A2A integration |
| v1.2 | 5–6 | Q&A history, federated KG, auto project graph |

See docs/design-harbormaster.md for the full design.

Lineage

Harbormaster v1.0 grew out of project-router-mcp v0.1 (2026-05-08). The v0.1 git history is preserved in this repository — the v0.1 single-file server lived at src/server.py and remains in commits prior to the v1.0 scaffolding refactor.

Architecture

Single Python process hosting an MCP server (stdio + HTTP/SSE), an embedded Live UI, and an optional FleetQ adapter. Pluggable backend per host (default: claude -p). All shell-bound strings pass through shlex.quote.

Detailed component diagrams, transport choices, and integration contract: docs/architecture-harbormaster.md.

FleetQ Bridge integration (optional)

Install with the [fleetq] extra and Harbormaster can register itself as a Bridge daemon in your FleetQ deployment, advertising its 6 MCP tools to the platform:

pipx install 'harbormaster-mcp[fleetq]'

In your config TOML:

[fleetq]
enabled = true
register_as_bridge = true
base_url = "https://app.fleetq.net"   # or your self-hosted FleetQ URL
api_token_env = "FLEETQ_API_TOKEN"    # env var holding the Sanctum token
heartbeat_interval = 30               # seconds between heartbeats

Then export your Sanctum token (must have a team:<uuid> ability) and run the MCP server:

export FLEETQ_API_TOKEN=...
harbormaster-mcp

Harbormaster shows up in your FleetQ Connections UI as harbormaster on <hostname>. v1.0.0a6 ships register + heartbeat + disconnect; the reverse-WebSocket relay channel for incoming MCP tool calls lands in v1.0.0a7+.
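The heartbeat cadence set by heartbeat_interval amounts to a timed loop. A heavily hedged sketch — the real adapter calls the FleetQ Bridge API; here the sender is injected so the shape is testable, and the function name is made up:

```python
import time

def heartbeat_loop(send_fn, interval: float = 30.0, max_beats=None) -> int:
    """Call send_fn() every `interval` seconds, mirroring the
    heartbeat_interval setting; max_beats bounds the loop for testing
    (the real adapter runs until shutdown)."""
    beats = 0
    while max_beats is None or beats < max_beats:
        send_fn()
        beats += 1
        if max_beats is None or beats < max_beats:
            time.sleep(interval)
    return beats

sent = []
count = heartbeat_loop(lambda: sent.append("beat"), interval=0.0, max_beats=3)
```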

Discovered contract reference: docs/fleetq-bridge-contract.md.

Releasing

PyPI publishing is automated via Trusted Publishing (OIDC) — no API tokens in the repo. Tag-pushes to v* trigger .github/workflows/publish.yml. Setup steps and the release checklist live in docs/publishing.md.

License

MIT — see LICENSE.
