
A-MEM: Self-evolving memory for coding agents


mcp-name: io.github.DiaaAj/a-mem-mcp

A-MEM is a self-evolving memory system for coding agents. Unlike simple vector stores, A-MEM automatically organizes knowledge into a Zettelkasten-style graph with dynamic relationships. Memories don't just get stored—they evolve and connect over time.

Currently tested with Claude Code. Support for other MCP-compatible agents is planned.

Quick Start

Install

pip install a-mem

Add to Claude Code

claude mcp add a-mem -s user -- a-mem-mcp \
  -e LLM_BACKEND=openai \
  -e LLM_MODEL=gpt-4o-mini \
  -e OPENAI_API_KEY=sk-...

That's it! A session-start hook installs automatically to remind Claude to use memory.

Note: Memory is stored per-project in ./chroma_db. For global memory across all projects, see Memory Scope.

Uninstall

a-mem-uninstall-hook   # Remove hooks first
pip uninstall a-mem

How It Works

t=0              t=1                t=2

                 ◉───◉             ◉───◉
 ◉               │                 ╱ │ ╲
                 ◉                ◉──┼──◉
                                     │
                                     ◉

━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━▶
            self-evolving memory
  1. Add a memory → A-MEM extracts keywords, context, and tags via an LLM
  2. Find neighbors → Searches for semantically similar existing memories
  3. Evolve → Decides whether to link, strengthen connections, or update related memories
  4. Store → Persists to ChromaDB with full metadata and relationships

The result: a knowledge graph that grows smarter over time, not just bigger.
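The four steps above can be sketched in plain Python. This is a toy illustration, not A-MEM's actual internals: `Note`, `ToyMemory`, `embed`, `similarity`, and `SIM_THRESHOLD` are all made-up names, and word overlap stands in for real embeddings and the LLM-driven evolution step.

```python
from dataclasses import dataclass, field

SIM_THRESHOLD = 0.2  # illustrative linking threshold


@dataclass
class Note:
    id: int
    content: str
    keywords: set                       # step 1: extracted metadata (an LLM does this in A-MEM)
    links: list = field(default_factory=list)


def embed(text: str) -> set:
    """Toy stand-in for a sentence embedding: a bag of lowercase words."""
    return set(text.lower().split())


def similarity(a: set, b: set) -> float:
    """Jaccard overlap as a toy stand-in for cosine similarity."""
    return len(a & b) / len(a | b) if a | b else 0.0


class ToyMemory:
    def __init__(self):
        self.notes: list[Note] = []

    def add_note(self, content: str) -> int:
        kw = embed(content)                      # 1. extract "keywords"
        note = Note(id=len(self.notes), content=content, keywords=kw)
        for other in self.notes:                 # 2. find semantic neighbors
            if similarity(kw, other.keywords) >= SIM_THRESHOLD:
                note.links.append(other.id)      # 3. evolve: link related notes both ways
                other.links.append(note.id)
        self.notes.append(note)                  # 4. store (ChromaDB in the real system)
        return note.id


mem = ToyMemory()
a = mem.add_note("JWT auth validated by middleware")
b = mem.add_note("middleware checks JWT on every request")
print(mem.notes[b].links)  # the two notes end up linked
```

The real system replaces each toy piece with an LLM call, sentence-transformer embeddings, and ChromaDB persistence, but the add → neighbors → evolve → store loop has this shape.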

Features

Self-Evolving Memory Memories aren't static. When you add new knowledge, A-MEM automatically finds related memories, strengthens connections, updates context, and evolves tags.

Semantic + Structural Search Combines vector similarity with graph traversal. Find memories by meaning, then explore their connections.
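A rough sketch of what "semantic then structural" could look like, using word overlap in place of vector similarity. None of these names are A-MEM's real API; this only illustrates the two-phase idea behind `search_memories_agentic`:

```python
# Toy memory graph: text plus explicit links between note ids.
notes = {
    0: {"text": "JWT stored in httpOnly cookies", "links": [1]},
    1: {"text": "AuthMiddleware validates tokens", "links": [0, 2]},
    2: {"text": "CSRF protection on POST routes", "links": [1]},
}


def score(query: str, text: str) -> float:
    """Toy stand-in for vector similarity: word overlap."""
    q, t = set(query.lower().split()), set(text.lower().split())
    return len(q & t) / len(q | t) if q | t else 0.0


def agentic_search(query: str, k: int = 2) -> list[int]:
    # Semantic phase: top-k notes ranked by similarity to the query.
    ranked = sorted(notes, key=lambda i: score(query, notes[i]["text"]), reverse=True)
    seeds = ranked[:k]
    # Structural phase: pull in each seed's graph neighbors as well.
    expanded = list(seeds)
    for i in seeds:
        for j in notes[i]["links"]:
            if j not in expanded:
                expanded.append(j)
    return expanded


print(agentic_search("jwt cookies", k=1))  # seed note plus its linked neighbor
```

The structural phase is what surfaces memories that don't match the query text but are connected to ones that do.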

MCP Tools

A-MEM exposes 6 tools to your coding agent:

Tool Description
add_memory_note Store new knowledge (async, returns immediately)
search_memories Semantic search across all memories
search_memories_agentic Search + follow graph connections
read_memory_note Get full details of a specific memory
update_memory_note Modify existing memory
delete_memory_note Remove a memory

Example Usage

# The agent calls these automatically, but here's what happens:

# Store a memory (returns task_id immediately)
add_memory_note(content="Auth uses JWT in httpOnly cookies, validated by AuthMiddleware")

# Search later
search_memories(query="authentication flow", k=5)

# Deep search with connections
search_memories_agentic(query="security", k=5)

Advanced Configuration

JSON Config

For more control, edit ~/.claude/settings.json (global) or .claude/settings.local.json (project):

{
  "mcpServers": {
    "a-mem": {
      "command": "a-mem-mcp",
      "env": {
        "LLM_BACKEND": "openai",
        "LLM_MODEL": "gpt-4o-mini",
        "OPENAI_API_KEY": "sk-..."
      }
    }
  }
}

Environment Variables

Variable Description Default
LLM_BACKEND openai, ollama, sglang, openrouter openai
LLM_MODEL Model name gpt-4o-mini
OPENAI_API_KEY OpenAI API key
EMBEDDING_MODEL Sentence transformer model all-MiniLM-L6-v2
CHROMA_DB_PATH Storage directory ./chroma_db
EVO_THRESHOLD Evolution trigger threshold 100

Memory Scope

  • Project-specific (default): Each project gets isolated memory in ./chroma_db
  • Global: Share across projects by setting CHROMA_DB_PATH=~/.local/share/a-mem/chroma_db
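The two scopes differ only in where CHROMA_DB_PATH points. A small illustrative snippet of how such a default-vs-override resolution works (the `resolve_db_path` helper is hypothetical, not part of A-MEM's API; only the CHROMA_DB_PATH variable and its default come from the table above):

```python
import os
from pathlib import Path


def resolve_db_path() -> Path:
    # Project scope (default): ./chroma_db relative to the current project.
    # Global scope: CHROMA_DB_PATH points at one shared directory instead.
    return Path(os.environ.get("CHROMA_DB_PATH", "./chroma_db")).expanduser()


os.environ["CHROMA_DB_PATH"] = "~/.local/share/a-mem/chroma_db"
print(resolve_db_path())  # expands to the shared global directory
```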

Alternative Backends

Ollama (local, free)

claude mcp add a-mem -s user -- a-mem-mcp \
  -e LLM_BACKEND=ollama \
  -e LLM_MODEL=llama2

OpenRouter (100+ models)

claude mcp add a-mem -s user -- a-mem-mcp \
  -e LLM_BACKEND=openrouter \
  -e LLM_MODEL=anthropic/claude-3.5-sonnet \
  -e OPENROUTER_API_KEY=sk-or-...

Hook Management (Claude Code)

The session-start hook reminds Claude to use memory tools. It is installed automatically when you add the server to Claude Code, but you can manage it manually:

a-mem-install-hook     # Install/reinstall hook
a-mem-uninstall-hook   # Remove hook completely

Python API

Use A-MEM directly in Python (works with any agent or application):

from agentic_memory.memory_system import AgenticMemorySystem

memory = AgenticMemorySystem(
    llm_backend="openai",
    llm_model="gpt-4o-mini"
)

# Add (auto-generates keywords, tags, context)
memory_id = memory.add_note("FastAPI app uses dependency injection for DB sessions")

# Search
results = memory.search("database patterns", k=5)

# Read full details
note = memory.read(memory_id)
print(note.keywords, note.tags, note.links)

Research

A-MEM implements concepts from the paper:

A-MEM: Agentic Memory for LLM Agents. Xu et al., 2025. arXiv:2502.12110
