
Mesh

Shared memory for AI agents, available as an MCP server and a Python library.

When one agent learns something, every other agent can recall it — by meaning, not exact keywords.

The problem

Every AI agent starts each session with amnesia. Claude in your terminal doesn't know what Claude in your editor discovered. Your coding agent can't access what your review agent found. Every agent repeats work already done.

The solution

Mesh is a shared semantic memory layer. Agents write memories. Agents read memories. They never need to communicate directly.

from mesh import Mesh

# Agent A learns something
agent_a = Mesh(namespace="my-project", agent_id="coding-agent")
agent_a.learn("The API rate limit is 100 requests/min")

# Agent B recalls it — no direct communication
agent_b = Mesh(namespace="my-project", agent_id="review-agent")
results = agent_b.recall("what are the API constraints?")
# → Returns the memory above with 0.91 similarity
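Recall works by similarity of meaning rather than keyword matching (hence the similarity score above). As a rough stdlib-only illustration of similarity-based recall, and not Mesh's actual implementation or scoring, a toy bag-of-words cosine similarity looks like this:

```python
import math
import re
from collections import Counter

def vectorize(text: str) -> Counter:
    # Toy "embedding": word counts (real systems use learned embeddings)
    return Counter(re.findall(r"[a-z0-9]+", text.lower()))

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[w] * b[w] for w in set(a) & set(b))
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

memories = [
    "The API rate limit is 100 requests/min",
    "Deploys run from ./scripts/deploy.sh",
]

query = "what are the API constraints?"
ranked = sorted(memories, key=lambda m: cosine(vectorize(query), vectorize(m)), reverse=True)
print(ranked[0])  # the rate-limit memory ranks first
```

The query shares no exact phrase with the stored memory, yet word overlap in vector space still ranks it first; real embeddings generalize this to synonyms and paraphrases.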

Installation

pip install mesh-memory

MCP Server (Claude Desktop + Cursor)

Add Mesh to your AI tools with one config change.

Claude Desktop — edit ~/Library/Application Support/Claude/claude_desktop_config.json:

{
  "mcpServers": {
    "mesh": {
      "command": "mesh-server"
    }
  }
}

Cursor — create .cursor/mcp.json in your project:

{
  "mcpServers": {
    "mesh": {
      "command": "mesh-server"
    }
  }
}

Restart the app. Your agents now have access to mesh_learn, mesh_recall, mesh_forget, and mesh_inspect.

Namespaces

Agents sharing a namespace share memory. Agents in different namespaces are isolated.

# Team A's agents
Mesh(namespace="team-alpha")

# Team B's agents — cannot see Team A's memories
Mesh(namespace="team-beta")

# Set namespace for the MCP server via env var
# MESH_NAMESPACE=my-project mesh-server
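The isolation model amounts to a separate store per namespace. A minimal stdlib sketch of that idea (not Mesh's actual storage, which persists under ~/.mesh/, and with a plain keyword filter standing in for semantic search):

```python
from collections import defaultdict

# Toy model of namespace isolation: each namespace gets its own memory list
stores: dict[str, list[str]] = defaultdict(list)

def learn(namespace: str, content: str) -> None:
    stores[namespace].append(content)

def recall(namespace: str, query: str) -> list[str]:
    # Only this namespace's store is ever searched
    words = query.lower().split()
    return [m for m in stores[namespace] if any(w in m.lower() for w in words)]

learn("team-alpha", "staging db password rotated on fridays")
learn("team-beta", "beta cluster uses spot instances")

print(recall("team-alpha", "staging"))  # visible inside team-alpha
print(recall("team-beta", "staging"))   # empty: team-beta cannot see it
```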

Memory types

Type         When to use
fact         Objective information
preference   User or system preferences
context      Situational background
result       Outcomes of tasks
instruction  Rules and guidelines
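A typed memory is simply content plus one of these labels, which lets agents filter what they recall. The sketch below is an illustration of that idea in plain Python, not the library's API; the HTTP endpoint later in this page suggests the field is named memory_type, but check the library for the exact Python parameter:

```python
from dataclasses import dataclass

@dataclass
class Memory:
    content: str
    memory_type: str  # one of: fact, preference, context, result, instruction

store: list[Memory] = []

def learn(content: str, memory_type: str = "fact") -> None:
    store.append(Memory(content, memory_type))

def by_type(memory_type: str) -> list[str]:
    return [m.content for m in store if m.memory_type == memory_type]

learn("The API rate limit is 100 requests/min", memory_type="fact")
learn("User prefers tabs over spaces", memory_type="preference")
learn("Always run tests before deploying", memory_type="instruction")

print(by_type("preference"))  # ['User prefers tabs over spaces']
```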

Storage

Memory lives in ~/.mesh/ on your machine. It persists across sessions and reboots. Nothing leaves your machine.

Features

v0.3 — Privacy, Portability, and Namespace Control

Privacy flags

Mark any memory as local-only to ensure it never leaves your machine:

mesh.learn(
    "Internal staging server is under heavy load",
    local_only=True
)

Export and import

# Export all non-private memories from a namespace
mesh-export backup.json --namespace shared

# Import into a new machine or namespace
mesh-import backup.json --namespace shared

# Preview without writing
mesh-import backup.json --dry-run

Namespace management

# List all namespaces
mesh-namespaces list

# Get detailed stats
mesh-namespaces stats work

# Delete a namespace
mesh-namespaces delete old-project

# Rename a namespace
mesh-namespaces rename work work-archived

Using Mesh from any tool

Start the HTTP server:

mesh-http

Then from any tool that supports HTTP:

Shell:

# Store a memory
curl -s -X POST http://localhost:7701/learn \
  -H "Content-Type: application/json" \
  -d '{"content": "deploy uses ./scripts/deploy.sh", "memory_type": "instruction"}'

# Recall memories
curl -s -X POST http://localhost:7701/recall \
  -H "Content-Type: application/json" \
  -d '{"query": "how do we deploy?", "count": 3}'

# Get formatted context for a system prompt
curl -s -X POST http://localhost:7701/context \
  -H "Content-Type: application/json" \
  -d '{"count": 10, "format": "markdown"}'

Python (no SDK needed):

import httpx  # pip install httpx

# Recall
r = httpx.post("http://localhost:7701/recall", json={"query": "deploy process"})
memories = r.json()["results"]

Node.js:

const res = await fetch("http://localhost:7701/recall", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({ query: "deploy process", count: 5 })
});
const { results } = await res.json();

License

MIT
