
Human-like memory for AI: auto-save, auto-recall, and a cognitive profile. Ships with Claude Code hooks, an MCP server (29 tools), and semantic, episodic, and procedural memory. A free, open-source alternative to Mem0.

Project description

Mengram

Give your AI agents memory that actually learns


Website · Get API Key · Docs · Console · Examples

pip install mengram-ai   # or: npm install mengram-ai

from mengram import Mengram
m = Mengram(api_key="om-...")           # Free key → mengram.io

m.add([{"role": "user", "content": "I use Python and deploy to Railway"}])
m.search("tech stack")                  # → facts
m.episodes(query="deployment")          # → events
m.procedures(query="deploy")            # → workflows that evolve from failures

Why Mengram?

Every AI memory tool stores facts. Mengram stores 3 types of memory — and procedures evolve when they fail.

| Feature                              | Mengram   | Mem0       | Zep        | Letta     |
|--------------------------------------|-----------|------------|------------|-----------|
| Semantic memory (facts, preferences) | Yes       | Yes        | Yes        | Yes       |
| Episodic memory (events, decisions)  | Yes       | No         | No         | Partial   |
| Procedural memory (workflows)        | Yes       | No         | No         | No        |
| Procedures evolve from failures      | Yes       | No         | No         | No        |
| Cognitive Profile                    | Yes       | No         | No         | No        |
| Multi-user isolation                 | Yes       | Yes        | Yes        | No        |
| Knowledge graph                      | Yes       | Yes        | Yes        | Yes       |
| LangChain + CrewAI + MCP             | Yes       | Partial    | Partial    | Partial   |
| Import ChatGPT / Obsidian            | Yes       | No         | No         | No        |
| Pricing                              | Free tier | $19-249/mo | Enterprise | Self-host |

Get Started in 30 Seconds

1. Get a free API key at mengram.io (email or GitHub)

2. Install

pip install mengram-ai

3. Use

from mengram import Mengram

m = Mengram(api_key="om-...")

# Add a conversation — auto-extracts facts, events, and workflows
m.add([
    {"role": "user", "content": "Deployed to Railway today. Build passed but forgot migrations — DB crashed. Fixed by adding a pre-deploy check."},
])

# Search across all 3 memory types at once
results = m.search_all("deployment issues")
# → {semantic: [...], episodic: [...], procedural: [...]}
JavaScript / TypeScript

npm install mengram-ai

const { MengramClient } = require('mengram-ai');
const m = new MengramClient('om-...');

await m.add([{ role: 'user', content: 'Fixed OOM by adding Redis cache layer' }]);
const results = await m.searchAll('database issues');
// → { semantic: [...], episodic: [...], procedural: [...] }
REST API (curl)

# Add memory
curl -X POST https://mengram.io/v1/add \
  -H "Authorization: Bearer om-..." \
  -H "Content-Type: application/json" \
  -d '{"messages": [{"role": "user", "content": "I prefer dark mode and vim keybindings"}]}'

# Search all 3 types
curl -X POST https://mengram.io/v1/search/all \
  -H "Authorization: Bearer om-..." \
  -d '{"query": "user preferences"}'

3 Memory Types

Semantic — facts, preferences, knowledge

m.search("tech stack")
# → ["Uses Python 3.12", "Deploys to Railway", "PostgreSQL with pgvector"]

Episodic — events, decisions, outcomes

m.episodes(query="deployment")
# → [{summary: "DB crashed due to missing migrations", outcome: "resolved", date: "2025-05-12"}]

Procedural — workflows that evolve

Week 1:  "Deploy" → build → push → deploy
                                         ↓ FAILURE: forgot migrations
Week 2:  "Deploy" v2 → build → run migrations → push → deploy
                                                          ↓ FAILURE: OOM
Week 3:  "Deploy" v3 → build → run migrations → check memory → push → deploy ✅

This happens automatically when you report failures:

m.procedure_feedback(proc_id, success=False,
                     context="OOM error on step 3", failed_at_step=3)
# → Procedure evolves to v3 with new step added

Or fully automatic — just add conversations and Mengram detects failures and evolves procedures:

m.add([{"role": "user", "content": "Deploy failed again — OOM on the build step"}])
# → Episode created → linked to "Deploy" procedure → failure detected → v3 created

Cognitive Profile

One API call generates a system prompt from all memories:

profile = m.get_profile()
# → "You are talking to Ali, a developer in Almaty. Uses Python, PostgreSQL,
#    and Railway. Recently debugged pgvector deployment. Prefers direct
#    communication and practical next steps."

Insert into any LLM's system prompt for instant personalization.
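As a minimal sketch of that injection step (the helper name and the example profile string are illustrative, not part of the Mengram SDK), the profile can simply be prepended as a system message before calling any chat-completion API:

```python
def with_profile(profile: str, messages: list[dict]) -> list[dict]:
    """Prepend the cognitive profile as the system message."""
    return [{"role": "system", "content": profile}] + messages

# profile = m.get_profile()  # from the Mengram client above
chat = with_profile(
    "You are talking to Ali, a developer in Almaty. Uses Python and Railway.",
    [{"role": "user", "content": "How should I deploy this service?"}],
)
# `chat` is now a message list you can pass to any chat-completion client.
```

Because the profile is plain text, the same pattern works with OpenAI, Anthropic, or local models without provider-specific glue.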

Import Existing Data

Kill the cold-start problem:

mengram import chatgpt ~/Downloads/chatgpt-export.zip --cloud   # ChatGPT history
mengram import obsidian ~/Documents/MyVault --cloud              # Obsidian vault
mengram import files notes/*.md --cloud                          # Any text/markdown

Integrations

MCP Server — Claude Desktop, Cursor, Windsurf

{
  "mcpServers": {
    "mengram": {
      "command": "mengram",
      "args": ["server", "--cloud"],
      "env": { "MENGRAM_API_KEY": "om-..." }
    }
  }
}

29 tools for memory management.

LangChain

pip install langchain-mengram

from langchain_mengram import (
    MengramRetriever,
    MengramChatMessageHistory,
)

retriever = MengramRetriever(api_key="om-...")
docs = retriever.invoke("deployment issues")

CrewAI

from integrations.crewai import create_mengram_tools

tools = create_mengram_tools(api_key="om-...")
# → 5 tools: search, remember, profile,
#   save_workflow, workflow_feedback

agent = Agent(role="Support", tools=tools)

OpenClaw

openclaw plugins install openclaw-mengram

Auto-recall before every turn, auto-capture after. 12 tools, slash commands, Graph RAG.

GitHub · npm

Multi-User Isolation

One API key, many users — each sees only their own data:

m.add([...], user_id="alice")
m.add([...], user_id="bob")

m.search_all("preferences", user_id="alice")  # Only Alice's memories
m.get_profile(user_id="alice")                 # Alice's cognitive profile

Async Client

Non-blocking Python client built on httpx:

from mengram import AsyncMengram

async with AsyncMengram() as m:
    await m.add([{"role": "user", "content": "I use async/await"}])
    results = await m.search("async")
    profile = await m.get_profile()

Install with pip install mengram-ai[async].

Metadata Filters

Filter search results by metadata:

results = m.search("config", filters={"agent_id": "support-bot", "app_id": "prod"})

Webhooks

Get notified when memories change:

m.create_webhook(
    url="https://your-app.com/hook",
    event_types=["memory_add", "memory_update"],
)

Agent Templates

Clone, set API key, run in 5 minutes:

| Template           | Stack      | What it shows                                             |
|--------------------|------------|-----------------------------------------------------------|
| DevOps Agent       | Python SDK | Procedures that evolve from deployment failures           |
| Customer Support   | CrewAI     | Agent with 5 memory tools; remembers returning customers  |
| Personal Assistant | LangChain  | Cognitive profile + auto-saving chat history              |
cd examples/devops-agent && pip install -r requirements.txt
export MENGRAM_API_KEY=om-...
python main.py

API Reference

| Endpoint                             | Description                                          |
|--------------------------------------|------------------------------------------------------|
| POST /v1/add                         | Add memories (auto-extracts all 3 types)             |
| POST /v1/search                      | Semantic search                                      |
| POST /v1/search/all                  | Unified search (semantic + episodic + procedural)    |
| GET /v1/episodes/search              | Search events and decisions                          |
| GET /v1/procedures/search            | Search workflows                                     |
| PATCH /v1/procedures/{id}/feedback   | Report outcome — triggers evolution                  |
| GET /v1/procedures/{id}/history      | Version history + evolution log                      |
| GET /v1/profile                      | Cognitive Profile                                    |
| GET /v1/triggers                     | Smart Triggers (reminders, contradictions, patterns) |
| POST /v1/agents/run                  | Memory agents (Curator, Connector, Digest)           |
| GET /v1/me                           | Account info                                         |

Full interactive docs: mengram.io/docs
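If you are not using an SDK, the endpoints above can be called with the standard library. This sketch only builds the authenticated request (the base URL and `om-...` key placeholder follow the curl examples earlier); the actual network call is left commented out:

```python
import json
import urllib.request

BASE = "https://mengram.io"

def build_request(path: str, payload: dict, api_key: str) -> urllib.request.Request:
    """Build an authenticated JSON POST request for a Mengram endpoint."""
    return urllib.request.Request(
        BASE + path,
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_request("/v1/search/all", {"query": "deployment issues"}, "om-...")
# with urllib.request.urlopen(req) as resp:   # performs the real HTTP call
#     results = json.load(resp)
```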


License

Apache 2.0 — free for commercial use.




Download files

Download the file for your platform.

Source Distribution

mengram_ai-2.16.0.tar.gz (214.5 kB)

Built Distribution


mengram_ai-2.16.0-py3-none-any.whl (221.7 kB)

File details

Details for the file mengram_ai-2.16.0.tar.gz.

File metadata

  • Download URL: mengram_ai-2.16.0.tar.gz
  • Size: 214.5 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.12.7

File hashes

Hashes for mengram_ai-2.16.0.tar.gz
Algorithm Hash digest
SHA256 7246ca8d3150c7120f235519cc2a85650e60aa62d9152ee9731789b2854f3900
MD5 9d0db6fc3c943ef4d965f592a1d2d875
BLAKE2b-256 571e08b9831cb09be7691ed8f6a25fc5e7da421f227db4c0ac9798e91652615f


File details

Details for the file mengram_ai-2.16.0-py3-none-any.whl.

File metadata

  • Download URL: mengram_ai-2.16.0-py3-none-any.whl
  • Size: 221.7 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.12.7

File hashes

Hashes for mengram_ai-2.16.0-py3-none-any.whl
Algorithm Hash digest
SHA256 764f08a9f2ac6c0f90407114d292d260dd6e2f539103dc0bee917c668247d4db
MD5 70d7841df0d842468c903ac97e78ec43
BLAKE2b-256 e4f48b03853e92a66ffd77f4acbc850d3d48ee4ccadc6a00d896132897d4fa80

