maestro-nerve
Shared intelligence memory plane for AI agents — remote MCP + CLI + Claude Code / Codex plugins.
One workspace-scoped, evidence-grounded memory, reachable from Claude Code, Claude Desktop, Codex, ChatGPT, or your own MCP client.
maestro-nerve is a persistent, workspace-scoped memory substrate for AI agents. Instead of letting every agent (Claude Code / Claude Desktop / Codex / ChatGPT / ...) keep its own private memory silo, Nerve gives all of them one shared, evidence-grounded memory plane reachable over MCP.
The seven V1 tools are workspace-bound, read-first, and grounded in source evidence:
| Tool | What it does |
|---|---|
| search_workspace | Ranked semantic search across entities, claims, and evidence. |
| list_entities | Paginated entity browser. |
| inspect | Aliases + claims + evidence for one entity. |
| profile | Narrative briefing synthesized from grounded memory. |
| ground | Evidence records for one claim. |
| discover | Time-axis feed of what changed. |
| act.draft | Drafts an action (email, reply, task) from workspace memory — never executes. |
Plus server_info — a free health probe.
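Over MCP, each tool is invoked with the protocol's standard tools/call method. A minimal sketch of the request payload for search_workspace — the "query" argument name is an assumption; fetch the real tool schema with a tools/list request first:

```python
import json

# Hypothetical JSON-RPC payload for invoking search_workspace over MCP.
# The "query" argument name is an assumption; the actual argument names
# come from the schema returned by a tools/list request.
payload = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "search_workspace",
        "arguments": {"query": "acme corp renewal"},
    },
}
print(json.dumps(payload))
```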
Install (pick one)
You need a Maestro API key first — mint one at https://nerve.maestro.onl/access.
Option A — pip + remote MCP (recommended)
pip install maestro-nerve
mnerve login # paste your API key
mnerve access --client claude-code --format mcp-json
# Copy the printed JSON into Claude Code MCP settings. Done.
Supported clients today: claude-code, codex, chatgpt.
Formats: envelope (default, scripting-friendly), mcp-json, settings-json, env.
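For orientation, the mcp-json output has roughly this shape — the server name, transport type, and header layout here are illustrative; paste whatever mnerve access actually prints:

```json
{
  "mcpServers": {
    "maestro-nerve": {
      "type": "http",
      "url": "https://nerve-api.maestro.onl/api/mcp?workspace=<your-slug>",
      "headers": { "Authorization": "Bearer <your api key>" }
    }
  }
}
```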
Option B — Claude Code plugin
/plugin marketplace add maestro-ai-stack/maestro-nerve
/plugin install maestro-nerve@maestro-nerve
/reload-plugins
The plugin brings the skill (agent guidance) and the CLI install hint; you still need pip install maestro-nerve && mnerve login for the CLI.
Option C — Codex plugin
codex plugin add maestro-ai-stack/maestro-nerve
Then pip install maestro-nerve && mnerve login && mnerve access --client codex.
Option D — Skill only (no CLI)
npx skills add maestro-ai-stack/maestro-nerve -y -g
Best for agents that only need the skill guidance and already have an API key in their environment.
CLI surface
mnerve login # paste / prompt for an API key
mnerve logout # remove local credentials
mnerve whoami # backend confirms identity + primary workspace
mnerve workspace list # list workspaces reachable with this key
mnerve status # one-line reachability probe
mnerve access --client <c> --format <f> # print MCP config for host c, shape f
mnerve version # installed package version
mnerve access --format values:
| --format | Shape printed on stdout |
|---|---|
| envelope (default) | {client, transport, workspace, endpoint, config, note} — good for scripts |
| mcp-json | Bare {mcpServers: {...}} — paste into Claude Desktop / Claude Code |
| settings-json | Same as mcp-json; merge under mcpServers in your settings.json |
| env | KEY=value lines; source into the shell that launches the host |
Hints are printed to stderr so stdout stays parseable.
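That split is what makes the command safe to pipe. A minimal sketch of the same discipline (not mnerve's actual code):

```python
import json
import sys

def emit(config: dict, hint: str) -> None:
    # Human-readable guidance goes to stderr...
    print(hint, file=sys.stderr)
    # ...so stdout carries nothing but parseable JSON.
    print(json.dumps(config))

emit(
    {"endpoint": "https://nerve-api.maestro.onl/api/mcp"},
    "Paste this into your MCP settings.",
)
```

A caller can then do `mnerve access --format mcp-json | jq .` without hints corrupting the JSON stream.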
Authentication model
- API key lives at ~/.config/maestro/auth.json (platform-appropriate), mode 0600.
- Override the path with MAESTRO_AUTH_FILE=/path/to/auth.json.
- The key is a bearer token for https://nerve-api.maestro.onl/api/*. Workspace isolation is enforced server-side; the key cannot reach workspaces it does not own.
- mnerve logout removes the local file but does not revoke the key server-side. Revoke at https://nerve.maestro.onl/access.
Remote MCP endpoint
POST https://nerve-api.maestro.onl/api/mcp?workspace=<your-slug>
Authorization: Bearer <your api key>
Content-Type: application/json
Every hosted MCP request is workspace-scoped at the transport layer; a single key cannot leak data between workspaces.
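Assembled in code, a request carries the workspace in the query string and the key in the Authorization header. A sketch that builds (but does not send) the request; the JSON-RPC payload shape follows the MCP convention and is not specific to Nerve:

```python
import json
import urllib.request

def build_mcp_request(workspace: str, api_key: str, payload: dict) -> urllib.request.Request:
    # Workspace scoping lives in the query string; auth in the bearer header.
    url = f"https://nerve-api.maestro.onl/api/mcp?workspace={workspace}"
    return urllib.request.Request(
        url,
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

# urllib.request.urlopen(req) would actually send it.
```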
Development / internal backend
The backend (pipeline, extraction, Brain UI) is closed source and lives in a private sibling repo; this public repo is the client-only distribution surface. File issues, feature requests, and plugin changes here. Backend-specific concerns can be filed here too; we triage and route them internally.
License
MIT — see LICENSE.
Built by Maestro — Singapore AI product studio.