Long-term memory for AI Agents
Project description
Learn more · Join Discord · Demo · OpenMemory
📄 Building Production-Ready AI Agents with Scalable Long-Term Memory →
⚡ +26% Accuracy vs. OpenAI Memory • 🚀 91% Faster • 💰 90% Fewer Tokens
🔥 Research Highlights
- +26% Accuracy over OpenAI Memory on the LOCOMO benchmark
- 91% Faster Responses than full-context, ensuring low-latency at scale
- 90% Lower Token Usage than full-context, cutting costs without compromise
- Read the full paper
Introduction
Mem0 ("mem-zero") enhances AI assistants and agents with an intelligent memory layer, enabling personalized AI interactions. It remembers user preferences, adapts to individual needs, and continuously learns over time—ideal for customer support chatbots, AI assistants, and autonomous systems.
Key Features & Use Cases
Core Capabilities:
- Multi-Level Memory: Seamlessly retains User, Session, and Agent state with adaptive personalization (see the scoping sketch after this section)
- Developer-Friendly: Intuitive API, cross-platform SDKs, and a fully managed service option
Applications:
- AI Assistants: Consistent, context-rich conversations
- Customer Support: Recall past tickets and user history for tailored help
- Healthcare: Track patient preferences and history for personalized care
- Productivity & Gaming: Adaptive workflows and environments based on user behavior
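As a taste of the multi-level scoping mentioned above, memories can be attached to a user, an agent, or a single session through identifier parameters. A minimal sketch using the open-source SDK; the identifiers are made up for illustration, and the exact parameter set is described in the docs:

```python
from mem0 import Memory

memory = Memory()

# User-level: persists across all of alice's sessions.
memory.add("Alice prefers dark mode", user_id="alice")

# Agent-level: knowledge tied to a specific agent.
memory.add("Escalate billing issues to a human", agent_id="support-bot")

# Session-level: scoped to a single run/conversation.
memory.add("Ticket #123 is still open", user_id="alice", run_id="session-42")

# Searches can be filtered with the same identifiers.
results = memory.search("preferences", user_id="alice")
```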
🚀 Quickstart Guide
Choose between our hosted platform and the self-hosted package:
Hosted Platform
Get up and running in minutes with automatic updates, analytics, and enterprise security.
- Sign up on Mem0 Platform
- Embed the memory layer via SDK or API keys
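A minimal sketch of what that looks like with the Python client, assuming an API key generated on the Mem0 Platform and exported as MEM0_API_KEY:

```python
import os
from mem0 import MemoryClient

# Hosted platform client; the API key comes from the Mem0 Platform dashboard.
client = MemoryClient(api_key=os.environ["MEM0_API_KEY"])

# Store a conversation turn, then retrieve related memories later.
client.add(
    [{"role": "user", "content": "I'm vegetarian and allergic to nuts."}],
    user_id="alice",
)
related = client.search("What should I cook for dinner?", user_id="alice")
```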
Self-Hosted (Open Source)
Install the SDK via pip:

```bash
pip install mem0ai
```

Or install the SDK via npm:

```bash
npm install mem0ai
```
Basic Usage
Mem0 requires an LLM to function, with gpt-4o-mini from OpenAI as the default. However, it supports a variety of LLMs; for details, refer to our Supported LLMs documentation.
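If you'd rather not use the default, a different provider can be selected through a config dictionary. A minimal sketch, assuming the config schema from the Supported LLMs documentation; the provider name and model shown are illustrative and may differ by version:

```python
from mem0 import Memory

# Illustrative config: keys follow the Supported LLMs docs; exact provider
# names and options may vary between versions.
config = {
    "llm": {
        "provider": "anthropic",
        "config": {
            "model": "claude-3-5-sonnet-20240620",
            "temperature": 0.1,
        },
    }
}

memory = Memory.from_config(config)
```

The docs cover the full config schema, including embedders and vector stores.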
With the default OpenAI setup, the first step is to instantiate the memory:
```python
from openai import OpenAI
from mem0 import Memory

openai_client = OpenAI()
memory = Memory()

def chat_with_memories(message: str, user_id: str = "default_user") -> str:
    # Retrieve relevant memories
    relevant_memories = memory.search(query=message, user_id=user_id, limit=3)
    memories_str = "\n".join(f"- {entry['memory']}" for entry in relevant_memories["results"])

    # Generate Assistant response
    system_prompt = f"You are a helpful AI. Answer the question based on query and memories.\nUser Memories:\n{memories_str}"
    messages = [{"role": "system", "content": system_prompt}, {"role": "user", "content": message}]
    response = openai_client.chat.completions.create(model="gpt-4o-mini", messages=messages)
    assistant_response = response.choices[0].message.content

    # Create new memories from the conversation
    messages.append({"role": "assistant", "content": assistant_response})
    memory.add(messages, user_id=user_id)

    return assistant_response

def main():
    print("Chat with AI (type 'exit' to quit)")
    while True:
        user_input = input("You: ").strip()
        if user_input.lower() == 'exit':
            print("Goodbye!")
            break
        print(f"AI: {chat_with_memories(user_input)}")

if __name__ == "__main__":
    main()
```
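After a few turns, you can inspect what has been stored for a user. A short sketch, assuming the same Memory instance as above; the return shape (a dict with a "results" list) reflects recent versions and may differ in older ones:

```python
# List everything remembered for the default user so far.
all_memories = memory.get_all(user_id="default_user")
for entry in all_memories["results"]:
    print(entry["memory"])
```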
For detailed integration steps, see the Quickstart and API Reference.
🔗 Integrations & Demos
- ChatGPT with Memory: Personalized chat powered by Mem0 (Live Demo)
- Browser Extension: Store memories across ChatGPT, Perplexity, and Claude (Chrome Extension)
- Langgraph Support: Build a customer bot with Langgraph + Mem0 (Guide)
- CrewAI Integration: Tailor CrewAI outputs with Mem0 (Example)
📚 Documentation & Support
- Full docs: https://docs.mem0.ai
- Community: Discord · Twitter
- Contact: founders@mem0.ai
Citation
We now have a paper you can cite:
```bibtex
@article{mem0,
  title={Mem0: Building Production-Ready AI Agents with Scalable Long-Term Memory},
  author={Chhikara, Prateek and Khant, Dev and Aryan, Saket and Singh, Taranjeet and Yadav, Deshraj},
  journal={arXiv preprint arXiv:2504.19413},
  year={2025}
}
```
⚖️ License
Apache 2.0 — see the LICENSE file for details.
Project details
Download files
Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
Source Distribution
- mem0ai-0.1.114.tar.gz (113.7 kB)

Built Distribution
- mem0ai-0.1.114-py3-none-any.whl (174.8 kB)
File details
Details for the file mem0ai-0.1.114.tar.gz.
File metadata
- Download URL: mem0ai-0.1.114.tar.gz
- Upload date:
- Size: 113.7 kB
- Tags: Source
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.12.9
File hashes
Algorithm | Hash digest
---|---
SHA256 | b27886132eaec78544e8b8b54f0b14a36728f3c99da54cb7cb417150e2fad7e1
MD5 | c618568c3d5b55cdfcad96800036f4e4
BLAKE2b-256 | 874781f43e173940d000694eb20a70c0a92149c53edd2095e34b618afa41ca7d
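To check a downloaded archive against the SHA256 digest above, a quick verification sketch (it assumes the file sits in the current working directory):

```python
import hashlib

# Compare the local file's SHA256 against the digest published above.
expected = "b27886132eaec78544e8b8b54f0b14a36728f3c99da54cb7cb417150e2fad7e1"
with open("mem0ai-0.1.114.tar.gz", "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()
print("OK" if digest == expected else "MISMATCH")
```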
Provenance
The following attestation bundles were made for mem0ai-0.1.114.tar.gz:

Publisher: cd.yml on mem0ai/mem0

Statement:
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: mem0ai-0.1.114.tar.gz
- Subject digest: b27886132eaec78544e8b8b54f0b14a36728f3c99da54cb7cb417150e2fad7e1
- Sigstore transparency entry: 264063289
- Sigstore integration time:
- Permalink: mem0ai/mem0@c0a930a7d3c4861e3917f17674bea6665f6ded95
- Branch / Tag: refs/tags/v0.1.114
- Owner: https://github.com/mem0ai
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: cd.yml@c0a930a7d3c4861e3917f17674bea6665f6ded95
- Trigger Event: release
File details
Details for the file mem0ai-0.1.114-py3-none-any.whl.
File metadata
- Download URL: mem0ai-0.1.114-py3-none-any.whl
- Upload date:
- Size: 174.8 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.12.9
File hashes
Algorithm | Hash digest
---|---
SHA256 | dfb7f0079ee282f5d9782e220f6f09707bcf5e107925d1901dbca30d8dd83f9b
MD5 | 14105c569fba75c26ced25d643b90d19
BLAKE2b-256 | 5eb750d1d1d0600e9e5a861e733644513816011504b9a3d0ba870eadb32a481f
Provenance
The following attestation bundles were made for mem0ai-0.1.114-py3-none-any.whl:

Publisher: cd.yml on mem0ai/mem0

Statement:
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: mem0ai-0.1.114-py3-none-any.whl
- Subject digest: dfb7f0079ee282f5d9782e220f6f09707bcf5e107925d1901dbca30d8dd83f9b
- Sigstore transparency entry: 264063291
- Sigstore integration time:
- Permalink: mem0ai/mem0@c0a930a7d3c4861e3917f17674bea6665f6ded95
- Branch / Tag: refs/tags/v0.1.114
- Owner: https://github.com/mem0ai
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: cd.yml@c0a930a7d3c4861e3917f17674bea6665f6ded95
- Trigger Event: release