
# hive-vault

CI PyPI Python 3.12+ License: MIT

Unified MCP server for AI-assisted development — on-demand Obsidian vault access + worker delegation to local/cloud models.

## The Problem

AI coding assistants load context statically. A typical CLAUDE.md grows to 800+ lines of standards, patterns, and project knowledge. Most of it is irrelevant to the current task. Every session pays the full token cost.

## The Solution

Hive replaces static context with on-demand vault queries via the Model Context Protocol. Your knowledge stays in an Obsidian vault. Your AI assistant loads only what it needs, when it needs it.

For tasks that don't need your primary model's full reasoning, Hive delegates to cheaper models (local Ollama or cloud OpenRouter) with automatic routing and budget controls.

Works with any MCP client: Claude Code, Codex CLI, Gemini CLI, Cursor, Windsurf, and others.

Measured result: 67–82% token reduction on targeted queries vs static context loading.

## Quick Start

Claude Code:

```shell
claude mcp add hive -- uvx hive-vault
```

Other MCP clients: point your client's MCP config at uvx hive-vault via stdio transport. See your client's docs for the exact syntax.
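
For clients configured via a JSON file, an entry along these lines usually works (the exact schema and file location vary by client; `hive` is simply the server name you choose):

```json
{
  "mcpServers": {
    "hive": {
      "command": "uvx",
      "args": ["hive-vault"],
      "env": {
        "VAULT_PATH": "/path/to/your/vault"
      }
    }
  }
}
```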

To configure the vault path (defaults to `~/Projects/knowledge`):

```shell
# Claude Code example
claude mcp add hive -e VAULT_PATH=/path/to/your/vault -- uvx hive-vault
```

## Tools

### Vault Tools (11)

| Tool | Description |
| --- | --- |
| `vault_list_projects` | List all projects in the Obsidian vault |
| `vault_query` | Read sections or files on demand (supports shortcuts: `context`, `tasks`, `roadmap`, `lessons`) |
| `vault_search` | Full-text search across the vault with metadata filters (type, status, tag) |
| `vault_health` | Health metrics for all vault projects (file counts, staleness, coverage) |
| `vault_update` | Write to vault with YAML frontmatter validation + auto git commit |
| `vault_create` | Create new files with auto-generated frontmatter + auto git commit |
| `vault_summarize` | Smart summarization — returns small files directly, delegates large ones |
| `vault_smart_search` | Ranked search with relevance scoring (status weight + recency + match density) |
| `session_briefing` | One-call context briefing: tasks + lessons + git log + health |
| `vault_recent` | Files changed in the vault in the last N days (git + frontmatter) |
| `vault_usage` | Tool usage analytics — call counts, token estimates, breakdowns |
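
To illustrate the kind of ranking `vault_smart_search` describes (status weight + recency + match density), here is a minimal sketch. The weights, field names, and decay curve are assumptions for illustration, not Hive's actual internals:

```python
from datetime import date

# Hypothetical status weights: active notes outrank archived ones.
STATUS_WEIGHT = {"active": 1.0, "planned": 0.6, "done": 0.3, "archived": 0.1}

def relevance(status: str, last_modified: date, today: date,
              match_count: int, total_words: int) -> float:
    recency = 1.0 / (1 + (today - last_modified).days / 30)  # decays over months
    density = match_count / max(total_words, 1)              # query hits per word
    return STATUS_WEIGHT.get(status, 0.5) + recency + 10 * density

# An active, recently edited note outranks a stale archived one
# with the same match density.
today = date(2025, 1, 31)
fresh = relevance("active", date(2025, 1, 29), today, 4, 200)
stale = relevance("archived", date(2024, 6, 1), today, 4, 200)
```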

### Worker Tools (3)

| Tool | Description |
| --- | --- |
| `delegate_task` | Route tasks to cheaper models with automatic tier selection |
| `list_models` | List available models across all providers |
| `worker_status` | Worker health: budget remaining, connectivity, usage stats |

## Resources

| URI | Description |
| --- | --- |
| `hive://projects` | All vault projects with file counts and available shortcuts |
| `hive://health` | Vault health metrics for all projects |
| `hive://projects/{project}/context` | Project context document (`00-context.md`) |
| `hive://projects/{project}/tasks` | Project task backlog (`11-tasks.md`) |
| `hive://projects/{project}/lessons` | Project lessons learned (`90-lessons.md`) |

## Prompts

| Prompt | Description |
| --- | --- |
| `retrospective` | End-of-session review — extracts lessons and appends to vault |
| `delegate` | Structured protocol for delegating tasks to cheaper models |
| `vault_sync` | Post-sprint vault synchronization — reconcile docs with shipped code |
| `benchmark` | Estimate token savings from hive MCP tools in the current session |

## Configuration

| Variable | Default | Description |
| --- | --- | --- |
| `VAULT_PATH` | `~/Projects/knowledge` | Path to your Obsidian vault |
| `HIVE_OLLAMA_ENDPOINT` | `http://localhost:11434` | Ollama API endpoint |
| `HIVE_OLLAMA_MODEL` | `qwen2.5-coder:7b` | Default Ollama model |
| `HIVE_OPENROUTER_API_KEY` | (unset) | OpenRouter API key (also reads `OPENROUTER_API_KEY`) |
| `HIVE_OPENROUTER_MODEL` | `qwen/qwen3-coder:free` | Default OpenRouter model |
| `HIVE_OPENROUTER_BUDGET` | `5.0` | Monthly budget cap in USD |
| `HIVE_DB_PATH` | `~/.local/share/hive/worker.db` | SQLite database for budget tracking |
| `HIVE_RELEVANCE_DB_PATH` | `~/.local/share/hive/relevance.db` | SQLite database for adaptive context scoring |
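
Putting several of these together, a registration with a custom vault and OpenRouter enabled might look like this (all values are placeholders; shown for Claude Code):

```shell
claude mcp add hive \
  -e VAULT_PATH="$HOME/vaults/knowledge" \
  -e HIVE_OPENROUTER_API_KEY="sk-or-..." \
  -e HIVE_OPENROUTER_BUDGET="2.50" \
  -- uvx hive-vault
```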

## Architecture

```
MCP Host (Claude Code, Codex CLI, Cursor, ...)
    └── hive (MCP server, stdio)
            ├── Vault Tools ──── Obsidian vault (~/Projects/knowledge/)
            │     query, search, update, create, summarize,
            │     smart_search, briefing, recent, usage, health
            │
            └── Worker Tools ─── delegate_task → routing:
                  list_models        1. Ollama (local, free)
                  worker_status      2. OpenRouter free tier
                                     3. OpenRouter paid ($5/mo cap)
                                     4. Reject → host handles it
```

## Maximizing Hive with CLAUDE.md

MCP servers don't activate themselves — your AI assistant needs guidance on when and how to use each tool. The CLAUDE.md file (or equivalent in your MCP client) is the key lever.

Add instructions like these to your project's CLAUDE.md:

```markdown
## Vault & Knowledge (Hive MCP)

When hive-vault MCP is available, use it for on-demand context:
- `vault_query(project="myproject", section="context")` — project overview
- `vault_query(project="myproject", section="tasks")` — active backlog
- `vault_search(query="...")` — cross-vault search
- `session_briefing(project="myproject")` — full context in one call

When writing to the vault: lessons → `90-lessons.md`, decisions → `30-architecture/`.
```

Without these instructions, your assistant might use Hive, but inconsistently. With them, it uses Hive predictably for every relevant query.

How it works: Your MCP client loads all available tools at session start. The assistant sees tool names and descriptions, but CLAUDE.md instructions tell it which tools to prefer for which situations. Multiple MCP servers coexist — they don't compete. Each serves its domain, and your instructions guide the routing.

## Worker Routing

Tasks are routed through a tiered system that minimizes cost:

1. **Ollama (local)** — Free. Runs on homelab hardware. Best for trivial tasks.
2. **OpenRouter free** — Free-tier models (Qwen3 Coder 480B). Real code work.
3. **OpenRouter paid** — DeepSeek V3.2 at $0.28/1M tokens. Used only when `max_cost_per_request > 0` and the monthly budget allows.
4. **Reject** — Returns an error so the host handles the task directly.
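
The tier selection above can be sketched as follows. This is an illustrative sketch only: the function shape, parameter names, and the exact cost check are assumptions, not Hive's actual code:

```python
def route(ollama_up: bool, free_tier_up: bool,
          max_cost_per_request: float, budget_left: float,
          est_cost: float) -> str:
    """Pick the cheapest viable tier for a delegated task (illustrative)."""
    if ollama_up:
        return "ollama"                  # tier 1: local, free
    if free_tier_up:
        return "openrouter-free"         # tier 2: free-tier cloud models
    if max_cost_per_request > 0 and est_cost <= min(max_cost_per_request, budget_left):
        return "openrouter-paid"         # tier 3: paid, within per-request and monthly caps
    return "reject"                      # tier 4: host handles the task directly
```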

## Vault Structure

Hive expects an Obsidian vault with this layout:

```
~/Projects/knowledge/          # vault root (configurable via VAULT_PATH)
├── 00_meta/
│   └── patterns/              # cross-project patterns
├── 10_projects/
│   ├── my-project/
│   │   ├── 00-context.md      # section shortcut: "context"
│   │   ├── 10-roadmap.md      # section shortcut: "roadmap"
│   │   ├── 11-tasks.md        # section shortcut: "tasks"
│   │   ├── 90-lessons.md      # section shortcut: "lessons"
│   │   └── 30-architecture/   # arbitrary paths
│   └── another-project/
└── ...
```
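
Notes in such a vault typically carry YAML frontmatter that the metadata filters operate on. A plausible sketch (the field names `type`, `status`, and `tags` are inferred from the `vault_search` filters; the exact schema Hive generates may differ):

```markdown
---
type: context
status: active
tags: [mcp, python]
---

# my-project

Project overview, constraints, and key decisions live here.
```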

## Development

See CONTRIBUTING.md for setup, code standards, and PR workflow.

```shell
git clone https://github.com/mlorentedev/hive.git
cd hive
make install   # create venv + install deps
make check     # lint + typecheck + test
```

## License

MIT
