# A-MEM: Self-evolving memory for AI agents
mcp-name: io.github.DiaaAj/a-mem-mcp
A-MEM is a self-evolving memory system for AI agents. Unlike simple vector stores, A-MEM automatically organizes knowledge into a Zettelkasten-style graph with typed relationships. Memories don't just get stored—they evolve and connect over time.
Use it as a Python library or as an MCP server with Claude and other AI assistants.
## Quick Start

### Install

```shell
pip install a-mem
```
### Configure with Claude Code

#### Step 1: Set up environment

Copy `.env.example` to `.env` and configure your API keys:

```shell
cp .env.example .env
# Edit .env with your API keys
```
#### Step 2: Add MCP server

**Option A: CLI (quick)**

```shell
claude mcp add --transport stdio a-mem -- a-mem-mcp
```
**Option B: JSON config (for custom env vars)**

Edit `~/.claude.json` or `.claude/settings.local.json`:

```json
{
  "mcpServers": {
    "a-mem": {
      "command": "a-mem-mcp",
      "env": {
        "LLM_BACKEND": "openai",
        "LLM_MODEL": "gpt-4o-mini",
        "OPENAI_API_KEY": "sk-..."
      }
    }
  }
}
```
> **Note:** If you use a `.env` file, the `env` section in the JSON config is optional.

**Memory scope:**

- **Project-specific (default):** each project gets isolated memory
- **Global:** share memory across projects by setting `CHROMA_DB_PATH=/home/user/.local/share/a-mem/chroma_db` in `.env`
## Features

### Self-Evolving Memory

Memories aren't static. When you add new knowledge, A-MEM automatically finds related memories, strengthens connections, updates context, and evolves tags.

### Semantic + Structural Search

Combines vector similarity with graph traversal: find memories by meaning, then explore their connections.
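To make the idea concrete, here is a minimal sketch of combining the two search modes — rank by vector similarity, then expand the top hit along its graph links. This is an illustration with made-up embeddings and link names, not A-MEM's actual implementation.

```python
import math

def cosine(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# Toy memory store: id -> (embedding, linked memory ids). Values are made up.
memories = {
    "jwt":        ([0.9, 0.1], ["middleware"]),
    "middleware": ([0.5, 0.5], ["jwt", "sessions"]),
    "sessions":   ([0.1, 0.9], ["middleware"]),
}

def agentic_search(query_vec, k=1):
    # Step 1: rank every memory by meaning (vector similarity).
    ranked = sorted(memories,
                    key=lambda m: cosine(query_vec, memories[m][0]),
                    reverse=True)
    seeds = ranked[:k]
    # Step 2: follow graph connections outward from the top hits.
    expanded = set(seeds)
    for m in seeds:
        expanded.update(memories[m][1])
    return seeds, sorted(expanded)

seeds, expanded = agentic_search([1.0, 0.0], k=1)
print(seeds)     # the top semantic hit
print(expanded)  # the hit plus its linked neighbors
```

A pure vector search would return only the closest match; the traversal step also surfaces memories that are structurally related even when their embeddings sit further from the query.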
## How It Works

```
t=0            t=1            t=2

                              ◉───◉
 ◉             ◉───◉          │ ╱ │ ╲
               ◉              ◉──┼──◉
                                 │
                                 ◉
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━▶
            self-evolving memory
```
1. **Add** a memory → A-MEM extracts keywords, context, and tags via LLM
2. **Find neighbors** → searches for semantically similar existing memories
3. **Evolve** → decides whether to link, strengthen connections, or update related memories
4. **Store** → persists to ChromaDB with full metadata and relationships

The result: a knowledge graph that grows smarter over time, not just bigger.
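The four steps above can be sketched in a few lines of plain Python. Stubs stand in for the LLM and ChromaDB, and every function name here is illustrative rather than A-MEM's real API:

```python
store = {}  # id -> {"content", "keywords", "links"}; stand-in for ChromaDB

def extract_keywords(content):
    """Stand-in for LLM analysis: naive keyword extraction."""
    return {w.lower() for w in content.split() if len(w) > 4}

def add_note(note_id, content):
    keywords = extract_keywords(content)                  # 1. Add: analyze content
    neighbors = [mid for mid, m in store.items()          # 2. Find neighbors
                 if keywords & m["keywords"]]
    for mid in neighbors:                                 # 3. Evolve: link both ways
        store[mid]["links"].append(note_id)
    store[note_id] = {"content": content,                 # 4. Store with metadata
                      "keywords": keywords,
                      "links": neighbors}
    return note_id

add_note("a", "Authentication tokens stored in cookies")
add_note("b", "Refresh authentication tokens hourly")
print(store["a"]["links"])  # -> ["b"]
print(store["b"]["links"])  # -> ["a"]
```

Note that adding the second memory modified the first one's links: that back-propagation of new knowledge into existing notes is what "self-evolving" refers to.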
## Configuration

### Environment Variables

| Variable | Description | Default |
|---|---|---|
| `LLM_BACKEND` | `openai`, `ollama`, `sglang`, or `openrouter` | `openai` |
| `LLM_MODEL` | Model name | `gpt-4o-mini` |
| `OPENAI_API_KEY` | OpenAI API key | — |
| `EMBEDDING_MODEL` | Sentence transformer model | `all-MiniLM-L6-v2` |
| `CHROMA_DB_PATH` | Storage directory | `./chroma_db` |
| `EVO_THRESHOLD` | Evolution trigger threshold | `100` |
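A config loader for these variables might look like the sketch below, falling back to the documented defaults when a variable is unset. The `load_config` helper is hypothetical, shown only to make the table's defaults concrete:

```python
import os

def load_config(env=os.environ):
    """Read A-MEM settings from the environment, with documented defaults."""
    return {
        "llm_backend":     env.get("LLM_BACKEND", "openai"),
        "llm_model":       env.get("LLM_MODEL", "gpt-4o-mini"),
        "embedding_model": env.get("EMBEDDING_MODEL", "all-MiniLM-L6-v2"),
        "chroma_db_path":  env.get("CHROMA_DB_PATH", "./chroma_db"),
        "evo_threshold":   int(env.get("EVO_THRESHOLD", "100")),
    }

# Only the backend and model are set; everything else uses its default.
cfg = load_config(env={"LLM_BACKEND": "ollama", "LLM_MODEL": "llama2"})
print(cfg["llm_backend"], cfg["evo_threshold"])  # -> ollama 100
```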
### Using Different Backends

**Ollama (local, free)**

```shell
export LLM_BACKEND=ollama
export LLM_MODEL=llama2
```

**OpenRouter (100+ models)**

```shell
export LLM_BACKEND=openrouter
export LLM_MODEL=anthropic/claude-3.5-sonnet
export OPENROUTER_API_KEY=sk-or-...
```
## MCP Tools

A-MEM exposes six tools to your AI agent:

| Tool | Description |
|---|---|
| `add_memory_note` | Store new knowledge (async, returns immediately) |
| `search_memories` | Semantic search across all memories |
| `search_memories_agentic` | Search + follow graph connections |
| `read_memory_note` | Get full details of a specific memory |
| `update_memory_note` | Modify an existing memory |
| `delete_memory_note` | Remove a memory |
### Example Usage

```python
# The agent calls these automatically, but here's what happens:

# Store a memory (returns a task_id immediately)
add_memory_note(content="Auth uses JWT in httpOnly cookies, validated by AuthMiddleware")

# Search later
search_memories(query="authentication flow", k=5)

# Deep search with connections
search_memories_agentic(query="security", k=5)
```
## Python API

Use A-MEM directly in Python:

```python
from agentic_memory.memory_system import AgenticMemorySystem

memory = AgenticMemorySystem(
    llm_backend="openai",
    llm_model="gpt-4o-mini",
)

# Add (auto-generates keywords, tags, and context)
memory_id = memory.add_note("FastAPI app uses dependency injection for DB sessions")

# Search
results = memory.search("database patterns", k=5)

# Read full details
note = memory.read(memory_id)
print(note.keywords, note.tags, note.links)
```
## Research

A-MEM implements concepts from the paper:

> **A-MEM: Agentic Memory for LLM Agents**
> Xu et al., 2025. [arXiv:2502.12110](https://arxiv.org/abs/2502.12110)