
Long-term memory for AI Agents

Project description

Mem0 - The Memory Layer for Personalized AI


Learn more · Join Discord


Introduction

Mem0 (pronounced as "mem-zero") enhances AI assistants and agents with an intelligent memory layer, enabling personalized AI interactions. Mem0 remembers user preferences, adapts to individual needs, and continuously improves over time, making it ideal for customer support chatbots, AI assistants, and autonomous systems.

New Feature: Introducing Graph Memory. Check out our documentation.

Core Features

  • Multi-Level Memory: User, Session, and AI Agent memory retention (see the sketch after this list)
  • Adaptive Personalization: Continuous improvement based on interactions
  • Developer-Friendly API: Simple integration into various applications
  • Cross-Platform Consistency: Uniform behavior across devices
  • Managed Service: Hassle-free hosted solution
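As a hedged sketch of multi-level memory, the snippet below scopes memories to a user, a single session, and an agent. It assumes the open-source add() and search() methods accept the user_id, agent_id, and run_id identifiers described in the Mem0 documentation; the identifiers and the API key are placeholders.

import os

from mem0 import Memory

os.environ["OPENAI_API_KEY"] = "sk-xxx"  # placeholder; the default LLM is OpenAI's gpt-4o

m = Memory()

# User-level memory: persists across all of Alice's conversations
m.add("Alice prefers vegetarian recipes", user_id="alice")

# Session-level memory: tied to one conversation via run_id
m.add("Currently planning a dinner party for six", user_id="alice", run_id="session-42")

# Agent-level memory: facts the assistant itself should retain
m.add("Always list allergens when suggesting dishes", agent_id="cooking-assistant")

# Searches can be scoped with the same identifiers
print(m.search(query="What should I cook this weekend?", user_id="alice"))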

How does Mem0 work?

Mem0 leverages a hybrid database approach to manage and retrieve long-term memories for AI agents and assistants. Each memory is associated with a unique identifier, such as a user ID or agent ID, allowing Mem0 to organize and access memories specific to an individual or context.

When a message is added to Mem0 using the add() method, the system extracts relevant facts and preferences and stores them across multiple data stores: a vector database, a key-value database, and a graph database. This hybrid approach ensures that different types of information are stored in the most efficient manner, making subsequent searches quick and effective.

When an AI agent or LLM needs to recall memories, it uses the search() method. Mem0 then performs a search across these data stores, retrieving relevant information from each source. This information is passed through a scoring layer, which evaluates each result based on relevance, importance, and recency. This ensures that only the most personalized and useful context is surfaced.

The retrieved memories can then be appended to the LLM's prompt as needed, enhancing the personalization and relevance of its responses.
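As a rough, hedged sketch of that flow (not the only way to wire it up), the snippet below searches Mem0 for memories relevant to a question and prepends them to an LLM prompt. The OpenAI client usage and the exact shape of the search results are assumptions; adjust them to your setup.

import os

from mem0 import Memory
from openai import OpenAI  # assumes the official openai package is installed

os.environ["OPENAI_API_KEY"] = "sk-xxx"  # placeholder key

m = Memory()
llm = OpenAI()

question = "Can you recommend a weekend activity for me?"

# Recall memories relevant to the question for this user
related = m.search(query=question, user_id="alice")

# Each retrieved item is assumed to expose its text under a "memory" key;
# adjust this to the result shape returned by your mem0 version.
context = "\n".join(item["memory"] for item in related)

# Append the retrieved memories to the LLM prompt as extra context
response = llm.chat.completions.create(
    model="gpt-4o",
    messages=[
        {"role": "system", "content": f"Known facts about the user:\n{context}"},
        {"role": "user", "content": question},
    ],
)
print(response.choices[0].message.content)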

Use Cases

Mem0 empowers organizations and individuals to enhance:

  • AI Assistants and agents: Seamless conversations with a touch of déjà vu
  • Personalized Learning: Tailored content recommendations and progress tracking
  • Customer Support: Context-aware assistance with user preference memory
  • Healthcare: Patient history and treatment plan management
  • Virtual Companions: Deeper user relationships through conversation memory
  • Productivity: Streamlined workflows based on user habits and task history
  • Gaming: Adaptive environments reflecting player choices and progress

Get Started

The easiest way to set up Mem0 is through the managed Mem0 Platform. This hosted solution offers automatic updates, advanced analytics, and dedicated support. Sign up to get started.
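Usage of the managed platform mirrors the open-source package. A minimal, hedged sketch (assuming the MemoryClient class from the mem0ai package and an API key from the platform dashboard) might look like this:

import os

from mem0 import MemoryClient

os.environ["MEM0_API_KEY"] = "m0-xxx"  # placeholder platform API key

client = MemoryClient()
client.add("I am vegetarian and allergic to nuts.", user_id="alice")
print(client.search("What can I cook for dinner?", user_id="alice"))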

If you prefer to self-host, use the open-source Mem0 package. Follow the installation instructions to get started.

Installation Instructions

Install the Mem0 package via pip:

pip install mem0ai

Alternatively, you can use Mem0 with one click on the hosted Mem0 Platform.

Basic Usage

Mem0 requires an LLM to function, with gpt-4o from OpenAI as the default. However, it supports a variety of LLMs; for details, refer to our Supported LLMs documentation.
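As an illustration, a configuration for a non-default LLM might look like the sketch below; the provider name and config keys shown here are assumptions to be checked against the Supported LLMs documentation.

from mem0 import Memory

config = {
    "llm": {
        "provider": "anthropic",  # example provider; see the Supported LLMs docs for the full list
        "config": {
            "model": "claude-3-5-sonnet-20240620",
            "temperature": 0.1,
            "max_tokens": 2000,
        },
    }
}

# The chosen provider's API key (e.g. ANTHROPIC_API_KEY) must be set in the environment.
m = Memory.from_config(config_dict=config)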

The first step is to instantiate the memory. The default configuration uses OpenAI's gpt-4o, so set OPENAI_API_KEY before creating it:

import os

from mem0 import Memory

os.environ["OPENAI_API_KEY"] = "sk-xxx"  # your OpenAI API key

m = Memory()

You can perform the following operations on the memory:

  1. Add: Store a memory from any unstructured text
  2. Update: Update memory of a given memory_id
  3. Search: Fetch memories based on a query
  4. Get: Return memories for a certain user/agent/session
  5. History: Describe how a memory has changed over time for a specific memory ID
# 1. Add: Store a memory from any unstructured text
result = m.add("I am working on improving my tennis skills. Suggest some online courses.", user_id="alice", metadata={"category": "hobbies"})
# Created memory --> 'Improving her tennis skills.' and 'Looking for online suggestions.'

# 2. Update: update the memory
result = m.update(memory_id=<memory_id_1>, data="Likes to play tennis on weekends")
# Updated memory --> 'Likes to play tennis on weekends.' and 'Looking for online suggestions.'

# 3. Search: search related memories
related_memories = m.search(query="What are Alice's hobbies?", user_id="alice")
# Retrieved memory --> 'Likes to play tennis on weekends'

# 4. Get all memories
all_memories = m.get_all()
memory_id = all_memories["memories"][0]["id"]  # get a memory_id
# All memory items --> 'Likes to play tennis on weekends.' and 'Looking for online suggestions.'

# 5. Get memory history for a particular memory_id
history = m.history(memory_id=<memory_id_1>)
# Logs corresponding to memory_id_1 --> {'prev_value': 'Working on improving tennis skills and interested in online courses for tennis.', 'new_value': 'Likes to play tennis on weekends'}

Tip: If you prefer a hosted version without the need to set up infrastructure yourself, check out the Mem0 Platform to get started in minutes.

Graph Memory

To initialize Graph Memory, you'll need to set up your configuration with a graph store provider. Currently, we support Neo4j as a graph store provider. You can set up Neo4j locally or use the hosted Neo4j AuraDB. You also need to set the version to v1.1 (prior versions are not supported). Here's how you can do it:

from mem0 import Memory

config = {
    "graph_store": {
        "provider": "neo4j",
        "config": {
            "url": "neo4j+s://xxx",
            "username": "neo4j",
            "password": "xxx"
        }
    },
    "version": "v1.1"
}

m = Memory.from_config(config_dict=config)
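Once Graph Memory is configured, the memory is used the same way as before; the calls below are only illustrative, with relationships between extracted entities additionally stored in Neo4j.

# With the graph store configured, add() and search() work as usual;
# relationships between extracted entities are additionally captured in Neo4j.
m.add("Alice works with Bob on the pricing project", user_id="alice")
print(m.search(query="Who does Alice work with?", user_id="alice"))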

Documentation

For detailed usage instructions and API reference, visit our documentation at docs.mem0.ai. Here, you can find more information on both the open-source version and the hosted Mem0 Platform.

Star History

Star History Chart

Support

Join our community for support and discussions. If you have any questions, feel free to reach out to us on Discord or through GitHub Issues.

Contributors

Join our Discord community to learn about memory management for AI agents and LLMs, and connect with Mem0 users and contributors. Share your ideas, questions, or feedback in our GitHub Issues.

We value and appreciate the contributions of our community. Special thanks to our contributors for helping us improve Mem0.

Anonymous Telemetry

We collect anonymous usage metrics to enhance our package's quality and user experience. This includes data like feature usage frequency and system info, but never personal details. The data helps us prioritize improvements and ensure compatibility. If you wish to opt-out, set the environment variable MEM0_TELEMETRY=false. We prioritize data security and don't share this data externally.
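For example, you can opt out in your shell (export MEM0_TELEMETRY=false) or at the top of a script, assuming the variable is read when the package initializes:

import os

# Opt out of anonymous telemetry; set this before importing mem0
os.environ["MEM0_TELEMETRY"] = "false"

from mem0 import Memory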

License

This project is licensed under the Apache 2.0 License - see the LICENSE file for details.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

mem0ai-0.1.30.tar.gz (57.1 kB)


Built Distribution

mem0ai-0.1.30-py3-none-any.whl (82.8 kB)


File details

Details for the file mem0ai-0.1.30.tar.gz.

File metadata

  • Download URL: mem0ai-0.1.30.tar.gz
  • Upload date:
  • Size: 57.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.12.4

File hashes

Hashes for mem0ai-0.1.30.tar.gz:

  • SHA256: 62a29212852cf9bc77e2965b86fe78562bdffcdebe7f429ed744bee988c902b0
  • MD5: 302560d485c46c0ec21d3f65cb0d976a
  • BLAKE2b-256: 528ad507bbfcbfdba2718b307ec1af556d6732964b8de98554f8b116a50ab505

See more details on using hashes here.

File details

Details for the file mem0ai-0.1.30-py3-none-any.whl.

File metadata

  • Download URL: mem0ai-0.1.30-py3-none-any.whl
  • Upload date:
  • Size: 82.8 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.12.4

File hashes

Hashes for mem0ai-0.1.30-py3-none-any.whl:

  • SHA256: cdb6c8bb792874b76898d2d1af6691b355f7b035ed5c1a286d4b2aaf136c29c6
  • MD5: 0ceb9baf53a57bfab8f818b00322ff9b
  • BLAKE2b-256: 54f15fe588cb566110b58261bf461df5daab03ce263b3050ad0fbf3df7a47841

See more details on using hashes here.
