
Zero-Knowledge Memory Layer for AI Agents — Cognitive Security Protocol with 4 proprietary seals. Giving Agents a Past. Giving Models a Soul.

Project description


Zero-Knowledge Memory Layer for AI Agents

Persistent. Encrypted. 1-line integration. 🧠


Website · Forge · Docs · PyPI · Smithery · MCP Registry


⚡ Quick Start (< 60 seconds)

Option A: MCP (Zero-Code — Claude, Cursor, Windsurf)

Add to your MCP config and restart. No API keys. No code. Done.

{
  "mcpServers": {
    "synapse-layer": {
      "url": "https://forge.synapselayer.org/api/mcp"
    }
  }
}
📁 Config file locations
| Client | Path |
| --- | --- |
| Claude Desktop (macOS) | `~/Library/Application Support/Claude/claude_desktop_config.json` |
| Claude Desktop (Windows) | `%APPDATA%\Claude\claude_desktop_config.json` |
| Cursor | `.cursor/mcp.json` (project) or `~/.cursor/mcp.json` (global) |
| Windsurf | `~/.codeium/windsurf/mcp_config.json` |

5 tools available instantly:

| Tool | Description |
| --- | --- |
| `save_to_synapse` | Store memory with full Cognitive Security pipeline |
| `recall` | Retrieve memories ranked by Trust Quotient™ |
| `search` | Cross-agent memory search with full-text matching |
| `process_text` | Auto-detect decisions, milestones, and alerts |
| `health_check` | System health, version, and capability report |

Option B: Python SDK

pip install synapse-layer
from synapse_memory import SynapseMemory, SqliteBackend

memory = SynapseMemory(
    agent_id="my-agent",
    backend=SqliteBackend(),  # persistent, zero-config
)

# Store (full security pipeline runs automatically)
result = await memory.store("User prefers dark mode", confidence=0.95)
print(result.trust_quotient)   # 0.89
print(result.sanitized)        # True (PII auto-redacted)
print(result.privacy_applied)  # True (DP noise injected)

# Recall with ranking
recalls = await memory.recall("user preferences")
for r in recalls:
    print(f"{r.content} (TQ: {r.trust_quotient:.2f})")

Option C: @remember Decorator (1-Line)

from synapse_memory import SynapseMemory, remember

memory = SynapseMemory(agent_id="my-agent")

@remember(memory)
async def answer(prompt: str) -> str:
    return llm.chat(prompt)  # auto recall + store
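The recall-then-store behavior behind `@remember` can be sketched in plain Python. This is a hypothetical reconstruction, not the library's actual implementation: it assumes the decorator recalls relevant memories before the call and stores the exchange afterward, and `FakeMemory` is a stand-in introduced only for this sketch.

```python
import asyncio
import functools

def remember_sketch(memory):
    """Hypothetical recall-then-store decorator (illustrative only)."""
    def decorator(fn):
        @functools.wraps(fn)
        async def wrapper(prompt: str) -> str:
            context = await memory.recall(prompt)            # fetch relevant memories first
            answer = await fn(prompt)                        # run the wrapped function
            await memory.store(f"Q: {prompt} A: {answer}")   # persist the exchange
            return answer
        return wrapper
    return decorator

class FakeMemory:
    """Stand-in backend used only for this sketch."""
    def __init__(self):
        self.items = []
    async def recall(self, query):
        return [m for m in self.items if query in m]
    async def store(self, content):
        self.items.append(content)

memory = FakeMemory()

@remember_sketch(memory)
async def answer(prompt: str) -> str:
    return f"echo: {prompt}"

result = asyncio.run(answer("hello"))
print(result)        # echo: hello
print(memory.items)  # ['Q: hello A: echo: hello']
```

The real decorator runs the full security pipeline on store; this sketch only shows the control flow.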

🧠 Why Synapse Layer?

AI agents forget everything between sessions. They lose context when switching models. They reprocess the same information every call.

Synapse Layer is the missing memory primitive.

| | Without Memory | With Synapse Layer |
| --- | --- | --- |
| Session state | Resets every turn | Persistent across sessions |
| Token usage | Reprocesses context | Up to 70% reduction via recall |
| Model switching | Context lost | Signed handover (GPT-4 ↔ Claude) |
| Privacy | Plaintext embeddings | AES-256-GCM + PII redaction + DP noise |
| Recall quality | Non-deterministic | Ranked by Trust Quotient™ |
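The signed-handover idea can be illustrated with a minimal HS256 token built from the standard library. This is a sketch of the generic JWT mechanism only; the actual Neural Handover™ format is proprietary, and the claim names here (`from_model`, `to_model`, `ctx`) are invented for illustration.

```python
import base64
import hashlib
import hmac
import json

def b64url(data: bytes) -> str:
    # JWT uses unpadded base64url encoding
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def sign_handover(payload: dict, secret: bytes) -> str:
    """Build a minimal HS256 JWT carrying a context-handover payload."""
    header = {"alg": "HS256", "typ": "JWT"}
    signing_input = ".".join(
        b64url(json.dumps(part, separators=(",", ":")).encode())
        for part in (header, payload)
    )
    sig = hmac.new(secret, signing_input.encode(), hashlib.sha256).digest()
    return f"{signing_input}.{b64url(sig)}"

def verify_handover(token: str, secret: bytes) -> dict:
    """Check the signature and return the payload, or raise ValueError."""
    signing_input, _, sig = token.rpartition(".")
    expected = hmac.new(secret, signing_input.encode(), hashlib.sha256).digest()
    if not hmac.compare_digest(b64url(expected), sig):
        raise ValueError("bad signature")
    payload_b64 = signing_input.split(".")[1]
    payload_b64 += "=" * (-len(payload_b64) % 4)  # restore base64 padding
    return json.loads(base64.urlsafe_b64decode(payload_b64))

secret = b"shared-secret"
token = sign_handover(
    {"from_model": "gpt-4", "to_model": "claude", "ctx": "user prefers dark mode"},
    secret,
)
claims = verify_handover(token, secret)
print(claims["to_model"])  # claude
```

Signing the handover lets the receiving model verify the context was not tampered with in transit.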

🛡️ Security Architecture

Every memory passes through a non-bypassable 4-layer Cognitive Security Pipeline:

Agent → Sanitize (PII) → Validate Intent → Encrypt (AES-256-GCM) → DP Noise → Vault
| Layer | Name | What It Does |
| --- | --- | --- |
| 1 | Semantic Privacy Guard™ | 15+ regex patterns for PII, secrets, credentials |
| 2 | Intelligent Intent Validation™ | Two-step categorization with self-healing on recall |
| 3 | AES-256-GCM Encryption | Authenticated encryption with PBKDF2 key derivation |
| 4 | Differential Privacy | Calibrated Gaussian noise on embeddings |
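Layer 1's regex-based redaction can be sketched with a few stdlib patterns. The three patterns below are illustrative examples only, not the library's actual 15+ rules.

```python
import re

# Illustrative PII patterns; the real Semantic Privacy Guard ships many more.
PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"),
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "API_KEY": re.compile(r"\bsk-[A-Za-z0-9]{20,}\b"),
}

def redact(text: str) -> str:
    """Replace each PII match with a typed placeholder."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label}]", text)
    return text

print(redact("Contact jane@example.com, SSN 123-45-6789"))
# Contact [EMAIL], SSN [SSN]
```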

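Layer 4's calibrated noise can be sketched with the standard Gaussian mechanism from the differential-privacy literature. The sigma formula below is the classic analytic bound; the calibration Synapse Layer actually uses is not documented here, so treat the parameters as assumptions.

```python
import math
import random

def gaussian_mechanism(embedding, epsilon=1.0, delta=1e-5, sensitivity=1.0, rng=None):
    """Add Gaussian noise calibrated to (epsilon, delta)-DP.

    Uses the textbook bound sigma = sensitivity * sqrt(2 ln(1.25/delta)) / epsilon;
    the library's actual calibration may differ.
    """
    rng = rng or random.Random()
    sigma = sensitivity * math.sqrt(2 * math.log(1.25 / delta)) / epsilon
    return [x + rng.gauss(0.0, sigma) for x in embedding]

rng = random.Random(42)  # seeded only so this sketch is reproducible
noisy = gaussian_mechanism([0.1, 0.2, 0.3], epsilon=1.0, delta=1e-5, rng=rng)
print(len(noisy))  # 3
```

Noising the embedding, rather than the plaintext, preserves approximate similarity search while bounding what any single memory reveals.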
🔑 MCP Permissions

| Permission | Required? | Justification |
| --- | --- | --- |
| `file_system` | Local only | Used exclusively by `SqliteBackend` for local `.synapse/memories.db` persistence. Remote mode (`forge.synapselayer.org/api/mcp`) does not use the filesystem; all data is stored in PostgreSQL. |
| `network` | Required | Communication with the remote MCP endpoint. |

Note: If you use the remote MCP endpoint (recommended), no filesystem access is needed. The file_system permission only applies when running the SDK locally with SqliteBackend.


🔌 Framework Integrations

Native adapters — install the extra and import:

| Framework | Install | Import | Status |
| --- | --- | --- | --- |
| LangChain | `pip install synapse-layer[langchain]` | `from synapse_memory.integrations import SynapseChatMessageHistory` | ✅ Stable |
| CrewAI | `pip install synapse-layer[crewai]` | `from synapse_memory.integrations.crewai_memory import SynapseCrewStorage` | ✅ Stable |
| AutoGen | `pip install synapse-layer[autogen]` | `from synapse_memory.integrations import SynapseAutoGenMemory` | ✅ Stable |
| LlamaIndex | `pip install synapse-layer[llamaindex]` | `from synapse_memory.integrations.llamaindex import SynapseRetriever` | ✅ Stable |
| Semantic Kernel | `pip install synapse-layer[semantic-kernel]` | `from synapse_memory.integrations.semantic_kernel import SynapseChatHistory` | ✅ Stable |
| MCP | No install needed | `forge.synapselayer.org/api/mcp` | ✅ Live |

LangChain Example

from synapse_memory.integrations import SynapseChatMessageHistory

history = SynapseChatMessageHistory(agent_id="my-agent", session_id="session-1")
history.add_user_message("I prefer concise answers.")
history.add_ai_message("Got it — I'll keep it brief.")

# Works with RunnableWithMessageHistory:
# chain_with_history = RunnableWithMessageHistory(
#     runnable=your_chain,
#     get_session_history=lambda sid: SynapseChatMessageHistory(agent_id="agent", session_id=sid),
# )

CrewAI Example

from synapse_memory.integrations.crewai_memory import SynapseCrewStorage

storage = SynapseCrewStorage(agent_id="research-crew")
# Use with CrewAI: Memory(storage=storage)

See examples/ for full working examples.


🔐 Encryption

from synapse_memory import SynapseCrypto

key = SynapseCrypto.generate_key()
crypto = SynapseCrypto(key)

ciphertext = crypto.encrypt("sensitive memory")
plaintext = crypto.decrypt(ciphertext)  # AES-256-GCM

# Or from environment
crypto = SynapseCrypto.from_env("SYNAPSE_ENCRYPTION_KEY")
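The key-derivation step behind `SynapseCrypto` can be sketched with the stdlib, assuming PBKDF2-HMAC-SHA256 at the 600,000 iterations this README cites. The hash choice and salt handling are assumptions; the actual `SynapseCrypto` internals may differ.

```python
import hashlib
import os

def derive_key(passphrase: str, salt: bytes, iterations: int = 600_000) -> bytes:
    """Derive a 256-bit AES key from a passphrase via PBKDF2-HMAC-SHA256."""
    return hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, iterations, dklen=32)

salt = os.urandom(16)  # a random salt, stored alongside the ciphertext
key = derive_key("correct horse battery staple", salt)
print(len(key))  # 32 bytes -> AES-256
```

The high iteration count deliberately slows brute-force attacks on the passphrase; the derived key then feeds the AES-256-GCM cipher.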

🏗️ Storage Backends

from synapse_memory import SynapseMemory, SqliteBackend, MemoryBackend

memory = SynapseMemory(agent_id="test")                         # In-memory (default)
memory = SynapseMemory(agent_id="prod", backend=SqliteBackend()) # SQLite (persistent)
memory = SynapseMemory(agent_id="x", backend=MyBackend())        # Custom (implement StorageBackend)
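A custom backend can be sketched as follows. The method names in `StorageBackendSketch` are hypothetical, introduced only for illustration; the real `StorageBackend` protocol in synapse-layer may define a different interface.

```python
from typing import Protocol

class StorageBackendSketch(Protocol):
    """Hypothetical backend shape; the real protocol may differ."""
    def save(self, agent_id: str, content: str) -> None: ...
    def query(self, agent_id: str, text: str) -> list[str]: ...

class DictBackend:
    """Minimal in-process backend keyed by agent, for illustration only."""
    def __init__(self):
        self._store: dict[str, list[str]] = {}

    def save(self, agent_id: str, content: str) -> None:
        self._store.setdefault(agent_id, []).append(content)

    def query(self, agent_id: str, text: str) -> list[str]:
        # naive substring match; a real backend would index for search
        return [m for m in self._store.get(agent_id, []) if text in m]

backend: StorageBackendSketch = DictBackend()
backend.save("my-agent", "User prefers dark mode")
print(backend.query("my-agent", "dark"))  # ['User prefers dark mode']
```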

🔍 Plugin Architecture

Clean OSS/PRO separation via the Strategy pattern:

from synapse_memory import AutoSaveEngine

engine = AutoSaveEngine(database=db, redactor=redact)             # OSS
engine = AutoSaveEngine(database=db, redactor=redact, mode="pro") # PRO (auto-loads synapse-layer-pro)
engine = AutoSaveEngine(database=db, importance_scorer=MyScorer()) # Custom strategies

Interfaces: ImportanceScorer, ConflictResolver, DedupStrategy, RedactionStrategy
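A custom strategy can be sketched like this. The `score` signature and the auto-save logic are assumptions made for illustration; only the interface names above come from the library.

```python
from typing import Protocol

class ImportanceScorerSketch(Protocol):
    """Hypothetical scorer interface; the real ImportanceScorer may differ."""
    def score(self, content: str) -> float: ...

class KeywordScorer:
    """Toy strategy: rate content by presence of decision-like keywords."""
    KEYWORDS = ("decided", "milestone", "alert")

    def score(self, content: str) -> float:
        hits = sum(kw in content.lower() for kw in self.KEYWORDS)
        return min(1.0, 0.3 + 0.35 * hits)

def should_auto_save(content: str, scorer: ImportanceScorerSketch, threshold: float = 0.5) -> bool:
    # The engine consults whichever strategy was injected at construction time.
    return scorer.score(content) >= threshold

print(should_auto_save("We decided to ship Friday", KeywordScorer()))  # True
print(should_auto_save("hello there", KeywordScorer()))                # False
```

The Strategy pattern keeps the OSS engine oblivious to whether a heuristic, PRO, or user-supplied scorer is plugged in.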


🏆 Competitive Comparison

Capabilities that set Synapse Layer apart from Mem0, Zep, and pgvector:

  • AES-256-GCM Encryption
  • PII Redaction (15+ patterns)
  • Differential Privacy
  • Intent Validation + Self-Healing
  • Cross-Model Handover (JWT), only partially available elsewhere
  • Trust Quotient™ Scoring
  • MCP Native (Official Registry)
  • Pluggable Storage Backends
  • Plugin Architecture
  • Zero-Knowledge Architecture

📊 v1.1.7 Numbers

  • 481 tests | 90% coverage
  • 5 framework integrations (LangChain, CrewAI, AutoGen, LlamaIndex, Semantic Kernel)
  • 5 MCP tools (real DB, not stubs)
  • 2 storage backends (Memory, SQLite) + custom protocol
  • AES-256-GCM with PBKDF2 key derivation (600k iterations)

🌐 Open Core Model

  • Community (Apache 2.0) — Full SDK, security pipeline, MCP integration, all backends, all integrations.
  • Enterprise — Advanced TQ calibration, multi-tenant vaults, production infrastructure.

🛣️ Roadmap

| Version | Status | Highlights |
| --- | --- | --- |
| v1.1.x | Stable | SqliteBackend, AES-256-GCM, `@remember`, 5 MCP tools, 5 framework integrations, 481 tests |
| v1.2.0 | 🚧 Next | Embedding model selection, vector similarity search, batch operations |
| v2.0.0 | 📋 Planned | Multi-tenant vault, team memory spaces, RBAC |

🤝 Contributing

git clone https://github.com/SynapseLayer/synapse-layer.git
cd synapse-layer
pip install -e ".[dev]"
python -m pytest tests/ -q  # 481 tests

See CONTRIBUTING.md for guidelines.


License

Apache License 2.0 — see LICENSE.

Open-core model: SDK, MCP server, and security pipeline are fully open source. Trust Quotient™ weights, Neural Handover™ internals, and Synapse Forge are proprietary.


Star Synapse Layer — Give your agents a past.



Giving Agents a Past. Giving Models a Soul. ⚗️



Built by Ismael Marchi · @synapselayer

Download files

Download the file for your platform.

Source Distribution

synapse_layer-1.1.8.tar.gz (90.6 kB)

Uploaded Source

Built Distribution


synapse_layer-1.1.8-py3-none-any.whl (83.0 kB)

Uploaded Python 3

File details

Details for the file synapse_layer-1.1.8.tar.gz.

File metadata

  • Download URL: synapse_layer-1.1.8.tar.gz
  • Upload date:
  • Size: 90.6 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.11.6

File hashes

Hashes for synapse_layer-1.1.8.tar.gz
| Algorithm | Hash digest |
| --- | --- |
| SHA256 | `1cac873e0a6d3835f301532b0f7bc2709cb29f0543b4bc16dbd2aa4e3a24dbad` |
| MD5 | `8643d8376984348b3ba3232fa8bb3c0b` |
| BLAKE2b-256 | `b5588ba62d62b955895ae46397070373fd8ea57bc779dbc2805c909eb1e8d046` |


File details

Details for the file synapse_layer-1.1.8-py3-none-any.whl.

File metadata

  • Download URL: synapse_layer-1.1.8-py3-none-any.whl
  • Upload date:
  • Size: 83.0 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.11.6

File hashes

Hashes for synapse_layer-1.1.8-py3-none-any.whl
| Algorithm | Hash digest |
| --- | --- |
| SHA256 | `db5442707bc0ed01a22df869f19ee77728c11c2fee07c58f314a2cb6f3f2de5c` |
| MD5 | `7f845de44b5f26095a6eea4e50364e83` |
| BLAKE2b-256 | `4c66bb150fc58606e637799dfb2a61df638fa678280ea0943b0a601bd647d8fe` |

