
Long-term memory for AI Agents


Mem0 - The Memory Layer for Personalized AI


Learn more · Join Discord · Demo · OpenMemory


📄 Building Production-Ready AI Agents with Scalable Long-Term Memory →

⚡ +26% Accuracy vs. OpenAI Memory • 🚀 91% Faster • 💰 90% Fewer Tokens

🔥 Research Highlights

  • +26% Accuracy over OpenAI Memory on the LOCOMO benchmark
  • 91% Faster Responses than full-context, ensuring low-latency at scale
  • 90% Lower Token Usage than full-context, cutting costs without compromise
  • Read the full paper

Introduction

Mem0 ("mem-zero") enhances AI assistants and agents with an intelligent memory layer, enabling personalized AI interactions. It remembers user preferences, adapts to individual needs, and continuously learns over time—ideal for customer support chatbots, AI assistants, and autonomous systems.

Key Features & Use Cases

Core Capabilities:

  • Multi-Level Memory: Seamlessly retains User, Session, and Agent state with adaptive personalization (see the sketch after this list)
  • Developer-Friendly: Intuitive API, cross-platform SDKs, and a fully managed service option
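
The sketch below illustrates the three memory levels; it assumes the user_id, agent_id, and run_id scoping arguments described in the docs, and all identifiers are made up:

from mem0 import Memory

memory = Memory()  # assumes OPENAI_API_KEY is set for the default LLM and embedder

# Hypothetical identifiers showing the three scopes.
# User-level memory: persists across sessions for this user
memory.add([{"role": "user", "content": "I prefer concise answers."}], user_id="alice")

# Agent-level memory: knowledge attached to one agent
memory.add([{"role": "user", "content": "Escalate billing issues to a human."}], agent_id="support-bot")

# Session-level memory: scoped to a single run
memory.add([{"role": "user", "content": "We are debugging a failed checkout."}],
           user_id="alice", run_id="session-42")

# Searches are scoped with the same identifiers
results = memory.search("How should replies be written?", user_id="alice")
for hit in results["results"]:
    print(hit["memory"])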

Applications:

  • AI Assistants: Consistent, context-rich conversations
  • Customer Support: Recall past tickets and user history for tailored help
  • Healthcare: Track patient preferences and history for personalized care
  • Productivity & Gaming: Adaptive workflows and environments based on user behavior

🚀 Quickstart Guide

Choose between our hosted platform and the self-hosted package:

Hosted Platform

Get up and running in minutes with automatic updates, analytics, and enterprise security.

  1. Sign up on Mem0 Platform
  2. Embed the memory layer via SDK or API keys (a minimal sketch follows)
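
As a minimal sketch of step 2, assuming the hosted MemoryClient from the Python SDK and an API key created in the dashboard (the user ID and messages below are illustrative):

import os
from mem0 import MemoryClient

# Hypothetical setup: the API key comes from the Mem0 dashboard and is
# read here from an environment variable.
client = MemoryClient(api_key=os.environ["MEM0_API_KEY"])

messages = [
    {"role": "user", "content": "I'm vegetarian and allergic to nuts."},
    {"role": "assistant", "content": "Noted! I'll keep that in mind for recipes."},
]
client.add(messages, user_id="alex")                      # store memories for this user
print(client.search("dietary restrictions", user_id="alex"))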

Self-Hosted (Open Source)

Install the Python SDK via pip:

pip install mem0ai

Or install the Node.js SDK via npm:

npm install mem0ai

Basic Usage

Mem0 requires an LLM to function, with gpt-4o-mini from OpenAI as the default. However, it supports a variety of LLMs; for details, refer to our Supported LLMs documentation.
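
To point Mem0 at a different model, pass a configuration when instantiating the memory. The snippet below is a sketch that assumes the Memory.from_config pattern and the config keys shown in the Supported LLMs docs; provider names and options vary by setup:

from mem0 import Memory

# Assumed config shape: an "llm" block selecting a provider and model.
# gpt-4o-mini is the documented default; swap in another supported
# provider as described in the Supported LLMs docs.
config = {
    "llm": {
        "provider": "openai",
        "config": {
            "model": "gpt-4o-mini",
            "temperature": 0.1,
        },
    }
}

memory = Memory.from_config(config)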

The first step is to instantiate the memory:

from openai import OpenAI
from mem0 import Memory

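# Both clients read OPENAI_API_KEY from the environment by default.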
openai_client = OpenAI()
memory = Memory()

def chat_with_memories(message: str, user_id: str = "default_user") -> str:
    # Retrieve relevant memories
    relevant_memories = memory.search(query=message, user_id=user_id, limit=3)
    memories_str = "\n".join(f"- {entry['memory']}" for entry in relevant_memories["results"])

    # Generate Assistant response
    system_prompt = f"You are a helpful AI. Answer the question based on query and memories.\nUser Memories:\n{memories_str}"
    messages = [{"role": "system", "content": system_prompt}, {"role": "user", "content": message}]
    response = openai_client.chat.completions.create(model="gpt-4o-mini", messages=messages)
    assistant_response = response.choices[0].message.content

    # Create new memories from the conversation
    messages.append({"role": "assistant", "content": assistant_response})
    memory.add(messages, user_id=user_id)

    return assistant_response

def main():
    print("Chat with AI (type 'exit' to quit)")
    while True:
        user_input = input("You: ").strip()
        if user_input.lower() == 'exit':
            print("Goodbye!")
            break
        print(f"AI: {chat_with_memories(user_input)}")

if __name__ == "__main__":
    main()

For detailed integration steps, see the Quickstart and API Reference.

🔗 Integrations & Demos

  • ChatGPT with Memory: Personalized chat powered by Mem0 (Live Demo)
  • Browser Extension: Store memories across ChatGPT, Perplexity, and Claude (Chrome Extension)
  • LangGraph Support: Build a customer bot with LangGraph + Mem0 (Guide)
  • CrewAI Integration: Tailor CrewAI outputs with Mem0 (Example)

📚 Documentation & Support

Citation

We now have a paper you can cite:

@article{mem0,
  title={Mem0: Building Production-Ready AI Agents with Scalable Long-Term Memory},
  author={Chhikara, Prateek and Khant, Dev and Aryan, Saket and Singh, Taranjeet and Yadav, Deshraj},
  journal={arXiv preprint arXiv:2504.19413},
  year={2025}
}

⚖️ License

Apache 2.0 — see the LICENSE file for details.


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

mem0ai-0.1.114.tar.gz (113.7 kB)

Built Distribution

mem0ai-0.1.114-py3-none-any.whl (174.8 kB)

File details

Details for the file mem0ai-0.1.114.tar.gz.

File metadata

  • Download URL: mem0ai-0.1.114.tar.gz
  • Upload date:
  • Size: 113.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.12.9

File hashes

Hashes for mem0ai-0.1.114.tar.gz

  • SHA256: b27886132eaec78544e8b8b54f0b14a36728f3c99da54cb7cb417150e2fad7e1
  • MD5: c618568c3d5b55cdfcad96800036f4e4
  • BLAKE2b-256: 874781f43e173940d000694eb20a70c0a92149c53edd2095e34b618afa41ca7d

See the pip documentation for more details on using hashes.
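
As an optional local check (a sketch, not part of the official tooling), you can recompute the SHA256 digest with Python's hashlib and compare it to the value above; the local filename is assumed:

import hashlib

EXPECTED_SHA256 = "b27886132eaec78544e8b8b54f0b14a36728f3c99da54cb7cb417150e2fad7e1"

# Assumes the archive was downloaded into the current directory.
with open("mem0ai-0.1.114.tar.gz", "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()

print("OK" if digest == EXPECTED_SHA256 else "MISMATCH")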

Provenance

The following attestation bundles were made for mem0ai-0.1.114.tar.gz:

Publisher: cd.yml on mem0ai/mem0

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file mem0ai-0.1.114-py3-none-any.whl.

File metadata

  • Download URL: mem0ai-0.1.114-py3-none-any.whl
  • Upload date:
  • Size: 174.8 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.12.9

File hashes

Hashes for mem0ai-0.1.114-py3-none-any.whl

  • SHA256: dfb7f0079ee282f5d9782e220f6f09707bcf5e107925d1901dbca30d8dd83f9b
  • MD5: 14105c569fba75c26ced25d643b90d19
  • BLAKE2b-256: 5eb750d1d1d0600e9e5a861e733644513816011504b9a3d0ba870eadb32a481f

See the pip documentation for more details on using hashes.

Provenance

The following attestation bundles were made for mem0ai-0.1.114-py3-none-any.whl:

Publisher: cd.yml on mem0ai/mem0

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
