
Mengram

The memory layer for AI agents that learns from experience

Your agents remember facts, events, and workflows — and procedures improve automatically when they fail.


Website · Get API Key · API Docs · Examples


Why Mengram?

Every AI memory tool stores facts. Mengram stores 3 types — and procedures evolve from failures.

Mengram Mem0 Letta Zep
Semantic Memory (facts)
Episodic Memory (events) Partial
Procedural Memory (workflows)
Experience-Driven Evolution
Cognitive Profile
Knowledge Graph
LangChain / CrewAI Partial
MCP Server
Price Free $19–249/mo Free (self-host) Enterprise

Quick Start

```shell
pip install mengram-ai
```

```python
from cloud.client import CloudMemory

m = CloudMemory(api_key="om-...")  # Free key → mengram.io/dashboard

# Add a conversation — Mengram auto-extracts facts, events, and workflows
m.add([
    {"role": "user", "content": "Deployed to Railway today. Build passed but forgot migrations — DB crashed. Fixed by adding a pre-deploy check."},
])

# Search facts
m.search("deployment setup")

# Search events — what happened?
m.episodes(query="deployment")
# → [{summary: "Deployed to Railway, DB crashed due to missing migrations", outcome: "resolved", ...}]

# Search workflows — how to do it?
m.procedures(query="deploy")
# → [{name: "Deploy to Railway", steps: ["build", "run migrations", "push", "verify"], ...}]

# Unified search — all 3 types at once
m.search_all("deployment issues")
# → {semantic: [...], episodic: [...], procedural: [...]}
```
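If you want a single ranked list rather than three separate buckets, a `search_all`-style response can be flattened client-side. A minimal sketch — the `score` field and item shapes below are illustrative assumptions, not a documented part of the response schema:

```python
# Flatten a search_all-style response into one list ranked by score.
# Item fields ("score", "text", etc.) are assumptions for illustration;
# adjust the key names to match the actual response.

def flatten_results(results, key="score"):
    """Merge semantic/episodic/procedural hits into one ranked list."""
    merged = []
    for memory_type in ("semantic", "episodic", "procedural"):
        for item in results.get(memory_type, []):
            merged.append({**item, "memory_type": memory_type})
    return sorted(merged, key=lambda x: x.get(key, 0.0), reverse=True)

# Example with a response-shaped dict:
results = {
    "semantic": [{"text": "Uses Railway", "score": 0.91}],
    "episodic": [{"summary": "DB crashed on deploy", "score": 0.84}],
    "procedural": [{"name": "Deploy to Railway", "score": 0.95}],
}
ranked = flatten_results(results)
# → procedural hit first (0.95), then semantic (0.91), then episodic (0.84)
```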

JavaScript / TypeScript:

```shell
npm install mengram-ai
```

```javascript
const { MengramClient } = require('mengram-ai');
const m = new MengramClient('om-...');

await m.add([{ role: 'user', content: 'Fixed OOM with Redis cache' }]);
const all = await m.searchAll('database issues');
// → { semantic: [...], episodic: [...], procedural: [...] }
```

Experience-Driven Procedures

The feature no one else has. Procedures learn from real outcomes — not static runbooks.

```
Week 1:  "Deploy" → build → push → deploy
                                         ↓ FAILURE: forgot migrations, DB crashed
Week 2:  "Deploy" v2 → build → run migrations → push → deploy
                                                          ↓ FAILURE: OOM on Railway
Week 3:  "Deploy" v3 → build → run migrations → check memory → push → deploy ✅
```

This happens automatically when you report failures:

```python
# Report failure with context → procedure evolves to a new version
m.procedure_feedback(proc_id, success=False,
                     context="OOM error on step 3", failed_at_step=3)

# View version history
history = m.procedure_history(proc_id)
# → {versions: [v1, v2, v3], evolution_log: [{change: "step_added", reason: "prevent OOM"}]}
```

Or fully automatic — add conversations and Mengram detects failures, links them to procedures, and evolves:

```python
m.add([{"role": "user", "content": "Deploy to Railway failed again — OOM on the build step"}])
# → Episode auto-linked to "Deploy" procedure → failure detected → v3 created
```
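In your own automation you can wire the feedback loop up explicitly: run each step and, on failure, report it via `procedure_feedback` so the procedure can evolve. A sketch assuming the `m` client from the Quick Start; `run_step` is a hypothetical stand-in for your real build/migrate/push commands, and the bare `success=True` call on completion is an assumption about the SDK:

```python
# Hypothetical deploy loop that reports outcomes back to Mengram.
# `m` is a CloudMemory client, `proc_id` an existing procedure id, and
# `run_step` is your own step-execution callable (illustrative).

def execute_procedure(m, proc_id, steps, run_step):
    for i, step in enumerate(steps, start=1):
        try:
            run_step(step)
        except Exception as exc:
            # Report the failing step so Mengram can evolve the procedure.
            m.procedure_feedback(proc_id, success=False,
                                 context=f"{step}: {exc}", failed_at_step=i)
            return False
    m.procedure_feedback(proc_id, success=True)
    return True
```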

Cognitive Profile

One API call generates a system prompt from all your memories:

```python
profile = m.get_profile()
# → "You are talking to Ali, a developer in Almaty building Mengram.
#    He uses Python, PostgreSQL, and Railway. Recently debugged pgvector deployment.
#    Workflows: deploys via build→twine→npm→git. Communicate directly, focus on practical next steps."
```

Insert into any LLM's system prompt for instant personalization.
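For example, prepend the profile as the system message of a chat request. A minimal sketch — only `get_profile` comes from Mengram; the commented-out OpenAI call and model name are illustrative:

```python
# Inject a Mengram cognitive profile as the system prompt of a chat call.
# The helper is plain data manipulation; the LLM call itself is shown
# commented out because it depends on your provider and credentials.

def with_profile(profile, messages):
    """Return messages with the profile injected as the system prompt."""
    return [{"role": "system", "content": profile}] + list(messages)

messages = with_profile(
    "You are talking to Ali, a developer building Mengram.",
    [{"role": "user", "content": "What should I work on next?"}],
)
# client.chat.completions.create(model="gpt-4o-mini", messages=messages)
```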

Integrations

MCP Server (Claude Desktop, Cursor, Windsurf)

```json
{
  "mcpServers": {
    "mengram": {
      "command": "mengram",
      "args": ["server", "--cloud"],
      "env": { "MENGRAM_API_KEY": "om-..." }
    }
  }
}
```

LangChain

```python
from integrations.langchain import MengramChatMessageHistory, MengramRetriever

# Drop-in message history — auto-saves to Mengram
history = MengramChatMessageHistory(api_key="om-...", session_id="session-1")

# RAG retriever — searches all 3 memory types
retriever = MengramRetriever(api_key="om-...")
```

CrewAI

```python
from crewai import Agent

from integrations.crewai import create_mengram_tools

tools = create_mengram_tools(api_key="om-...")
# → 5 tools: search, remember, profile, save_workflow, workflow_feedback

agent = Agent(role="Support Engineer", tools=tools)
```

Agent Templates

Ready-to-run examples — clone, set API key, run in 5 minutes:

| Template | Stack | What it shows |
|---|---|---|
| DevOps Agent | Python SDK | Procedures that evolve from deployment failures |
| Customer Support | CrewAI | Agent with 5 memory tools, remembers returning customers |
| Personal Assistant | LangChain | Cognitive profile + auto-saving chat history |

```shell
cd examples/devops-agent && pip install -r requirements.txt
export MENGRAM_API_KEY=om-...
python main.py
```

API Reference

All endpoints require `Authorization: Bearer om-...` — your key identifies you, so no `user_id` is needed.

| Endpoint | Description |
|---|---|
| `POST /v1/add` | Add memories (auto-extracts all 3 types) |
| `POST /v1/search` | Semantic search |
| `POST /v1/search/all` | Unified search (all 3 types) |
| `GET /v1/episodes/search` | Search episodic memories |
| `GET /v1/procedures/search` | Search procedural memories |
| `PATCH /v1/procedures/{id}/feedback` | Report success/failure → triggers evolution |
| `GET /v1/procedures/{id}/history` | Version history + evolution log |
| `GET /v1/profile` | Cognitive Profile |
| `GET /v1/triggers` | Smart Triggers (reminders, contradictions, patterns) |
| `POST /v1/agents/run` | Run memory agents (Curator, Connector, Digest) |
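The REST API can also be called without the SDK. A sketch using only the Python standard library; the base URL and the JSON body fields are inferred from the SDK examples above and may differ from the actual schema:

```python
import json
import urllib.request

API_KEY = "om-..."  # your Mengram API key

# Build (but don't send) a unified-search request. The host
# "api.mengram.io" and the {"query": ...} body are assumptions.
req = urllib.request.Request(
    "https://api.mengram.io/v1/search/all",
    data=json.dumps({"query": "deployment issues"}).encode(),
    headers={
        "Authorization": f"Bearer {API_KEY}",
        "Content-Type": "application/json",
    },
    method="POST",
)
# with urllib.request.urlopen(req) as resp:
#     results = json.load(resp)
```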

Full interactive docs: mengram.io/docs

License

Apache 2.0 — free for commercial use.




Download files

Download the file for your platform.

Source Distribution

mengram_ai-2.7.3.tar.gz (125.3 kB, source)

Built Distribution


mengram_ai-2.7.3-py3-none-any.whl (134.6 kB, Python 3 wheel)

File details

Details for the file mengram_ai-2.7.3.tar.gz.

File metadata

  • Download URL: mengram_ai-2.7.3.tar.gz
  • Upload date:
  • Size: 125.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.12.7

File hashes

Hashes for mengram_ai-2.7.3.tar.gz:

| Algorithm | Hash digest |
|---|---|
| SHA256 | b97e4b5682d40bda869c718c385f36697f7c6f3acead1b4236b3d652e1e16719 |
| MD5 | b38e4129f8dd4171a6ab26fe97d5885b |
| BLAKE2b-256 | c98b6af3ea584f18bed864f05001274834ae48f86eb2ceece1786c2330e75fcd |


File details

Details for the file mengram_ai-2.7.3-py3-none-any.whl.

File metadata

  • Download URL: mengram_ai-2.7.3-py3-none-any.whl
  • Upload date:
  • Size: 134.6 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.12.7

File hashes

Hashes for mengram_ai-2.7.3-py3-none-any.whl:

| Algorithm | Hash digest |
|---|---|
| SHA256 | 38a7da643c3772ee415732b0e85e19908d98e74f2ff0ce58b9a91c5be7a08dd9 |
| MD5 | 381e2301b056ea7bf06fd101698b7f16 |
| BLAKE2b-256 | 4783384a9b65a3035a8ec154623608664b5f717015a9d1fe7cd7b733854fee60 |

