MCP server for integrating long-term memory into AI agents with Mem0

MCP-Mem0: Long-Term Memory for AI Agents

Mem0 and MCP Integration

A template implementation of the Model Context Protocol (MCP) server integrated with Mem0 for providing AI agents with persistent memory capabilities.

Use this as a reference point to build your own MCP servers, or hand it to an AI coding assistant as an example to follow for structure and code correctness!

Overview

This project demonstrates how to build an MCP server that enables AI agents to store, retrieve, and search memories using semantic search. It serves as a practical template for creating your own MCP servers, with Mem0 supplying the memory layer as a concrete example.

The implementation follows the best practices laid out by Anthropic for building MCP servers, allowing seamless integration with any MCP-compatible client.

Features

The server provides three essential memory management tools (see the sketch after this list):

  1. save_memory: Store any information in long-term memory with semantic indexing
  2. get_all_memories: Retrieve all stored memories for comprehensive context
  3. search_memories: Find relevant memories using semantic search
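
For orientation, these tools map almost one-to-one onto Mem0's core client calls. Below is a minimal sketch assuming the mem0ai Python package; the wrapper function names mirror the tools but are illustrative rather than the server's actual code:

from mem0 import Memory

# In the real server the client is configured from the environment
# variables described in the Configuration section (LLM, embedder,
# Postgres vector store); the default constructor keeps the sketch short.
memory = Memory()

def save_memory(text: str):
    # Store the text; Mem0 extracts and semantically indexes it.
    memory.add(text, user_id="user")

def get_all_memories():
    # Return every stored memory for comprehensive context.
    return memory.get_all(user_id="user")

def search_memories(query: str, limit: int = 3):
    # Semantic search over stored memories, best matches first.
    return memory.search(query, user_id="user", limit=limit)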

Prerequisites

  • Python 3.12+
  • Supabase or any PostgreSQL database (for vector storage of memories)
  • API keys for your chosen LLM provider (OpenAI, OpenRouter, or Ollama)
  • Docker if running the MCP server as a container (recommended)

Installation

Using uv

  1. Install uv if you don't have it:

    pip install uv
    
  2. Clone this repository:

    git clone https://github.com/coleam00/mcp-mem0.git
    cd mcp-mem0
    
  3. Install dependencies:

    uv pip install -e .
    
  4. Create a .env file based on .env.example:

    cp .env.example .env
    
  5. Configure your environment variables in the .env file (see Configuration section)

Using Docker (Recommended)

  1. Build the Docker image:

    docker build -t mcp/mem0 --build-arg PORT=8050 .
    
  2. Create a .env file based on .env.example and configure your environment variables

Configuration

The following environment variables can be configured in your .env file:

  • TRANSPORT: Transport protocol (sse or stdio). Example: sse
  • HOST: Host to bind to when using SSE transport. Example: 0.0.0.0
  • PORT: Port to listen on when using SSE transport. Example: 8050
  • LLM_PROVIDER: LLM provider (openai, openrouter, or ollama). Example: openai
  • LLM_BASE_URL: Base URL for the LLM API. Example: https://api.openai.com/v1
  • LLM_API_KEY: API key for the LLM provider. Example: sk-...
  • LLM_CHOICE: LLM model to use. Example: gpt-4o-mini
  • EMBEDDING_MODEL_CHOICE: Embedding model to use. Example: text-embedding-3-small
  • DATABASE_URL: PostgreSQL connection string. Example: postgresql://user:pass@host:port/db
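
Put together, a .env for the default OpenAI setup might look like the following (all values are the example placeholders from the list above; substitute your own API key and connection string):

TRANSPORT=sse
HOST=0.0.0.0
PORT=8050
LLM_PROVIDER=openai
LLM_BASE_URL=https://api.openai.com/v1
LLM_API_KEY=sk-...
LLM_CHOICE=gpt-4o-mini
EMBEDDING_MODEL_CHOICE=text-embedding-3-small
DATABASE_URL=postgresql://user:pass@host:port/db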

Running the Server

Using uv

SSE Transport

# Set TRANSPORT=sse in .env then:
uv run src/main.py

This runs the MCP server as an API endpoint that you can connect to using the configuration shown below.

Stdio Transport

With stdio, the MCP client itself can spin up the MCP server, so there is nothing to run at this point.

Using Docker

SSE Transport

docker run --env-file .env -p 8050:8050 mcp/mem0

This runs the MCP server as an API endpoint within the container that you can connect to using the configuration shown below.

Stdio Transport

With stdio, the MCP client itself can spin up the MCP server container, so there is nothing to run at this point.

Integration with MCP Clients

SSE Configuration

Once you have the server running with SSE transport, you can connect to it using this configuration:

{
  "mcpServers": {
    "mem0": {
      "transport": "sse",
      "url": "http://localhost:8050/sse"
    }
  }
}

Note for Windsurf users: Use serverUrl instead of url in your configuration:

{
  "mcpServers": {
    "mem0": {
      "transport": "sse",
      "serverUrl": "http://localhost:8050/sse"
    }
  }
}

Note for n8n users: Use host.docker.internal instead of localhost, since n8n has to reach outside of its own container to the host machine:

So the full URL in the MCP node would be: http://host.docker.internal:8050/sse

Make sure to update the port if you are using a value other than the default 8050.

Python with Stdio Configuration

Add this server to your MCP configuration for Claude Desktop, Windsurf, or any other MCP client (the example below uses a Windows-style virtual environment path; on macOS/Linux the interpreter lives at .venv/bin/python):

{
  "mcpServers": {
    "mem0": {
      "command": "your/path/to/mcp-mem0/.venv/Scripts/python.exe",
      "args": ["your/path/to/mcp-mem0/src/main.py"],
      "env": {
        "TRANSPORT": "stdio",
        "LLM_PROVIDER": "openai",
        "LLM_BASE_URL": "https://api.openai.com/v1",
        "LLM_API_KEY": "YOUR-API-KEY",
        "LLM_CHOICE": "gpt-4o-mini",
        "EMBEDDING_MODEL_CHOICE": "text-embedding-3-small",
        "DATABASE_URL": "YOUR-DATABASE-URL"
      }
    }
  }
}

Docker with Stdio Configuration

With this configuration the MCP client launches the container itself; each -e flag forwards the corresponding variable from the env block into the container:

{
  "mcpServers": {
    "mem0": {
      "command": "docker",
      "args": ["run", "--rm", "-i", 
               "-e", "TRANSPORT", 
               "-e", "LLM_PROVIDER", 
               "-e", "LLM_BASE_URL", 
               "-e", "LLM_API_KEY", 
               "-e", "LLM_CHOICE", 
               "-e", "EMBEDDING_MODEL_CHOICE", 
               "-e", "DATABASE_URL", 
               "mcp/mem0"],
      "env": {
        "TRANSPORT": "stdio",
        "LLM_PROVIDER": "openai",
        "LLM_BASE_URL": "https://api.openai.com/v1",
        "LLM_API_KEY": "YOUR-API-KEY",
        "LLM_CHOICE": "gpt-4o-mini",
        "EMBEDDING_MODEL_CHOICE": "text-embedding-3-small",
        "DATABASE_URL": "YOUR-DATABASE-URL"
      }
    }
  }
}

Building Your Own Server

This template provides a foundation for building more complex MCP servers. To build your own:

  1. Add your own tools by creating methods with the @mcp.tool() decorator (see the sketch after this list)
  2. Create your own lifespan function to add your own dependencies (clients, database connections, etc.)
  3. Modify the utils.py file for any helper functions you need for your MCP server
  4. Feel free to add prompts and resources as well with @mcp.resource() and @mcp.prompt()
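
As a minimal sketch of how those pieces fit together, assuming the official MCP Python SDK's FastMCP class (AppContext, connect_to_db, and the db.lookup call are hypothetical placeholders for your own dependencies):

from collections.abc import AsyncIterator
from contextlib import asynccontextmanager
from dataclasses import dataclass

from mcp.server.fastmcp import Context, FastMCP

@dataclass
class AppContext:
    db: object  # your own client: database, Mem0, HTTP, etc.

@asynccontextmanager
async def app_lifespan(server: FastMCP) -> AsyncIterator[AppContext]:
    # Set up dependencies once at startup and tear them down on shutdown.
    db = await connect_to_db()  # hypothetical helper, e.g. from utils.py
    try:
        yield AppContext(db=db)
    finally:
        await db.close()

mcp = FastMCP("my-server", lifespan=app_lifespan)

@mcp.tool()
async def my_tool(query: str, ctx: Context) -> str:
    """Description the MCP client's LLM sees when choosing tools."""
    # Dependencies created in the lifespan are available on the context.
    db = ctx.request_context.lifespan_context.db
    return await db.lookup(query)  # hypothetical method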


Download files

Download the file for your platform.

Source Distribution

iflow_mcp_mem0_mcp-0.1.1.tar.gz (6.8 kB)

Built Distribution

iflow_mcp_mem0_mcp-0.1.1-py3-none-any.whl (7.7 kB)

File details

Details for the file iflow_mcp_mem0_mcp-0.1.1.tar.gz.

File metadata

  • Download URL: iflow_mcp_mem0_mcp-0.1.1.tar.gz
  • Size: 6.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: uv/0.9.10 (macOS)

File hashes

  • SHA256: d18a50eb6a3a1788027556c4c2ad66b784540ced8f82e9c13112eb42c61eccb8
  • MD5: 8aac6ea03af2ed608e470f78ffd6d606
  • BLAKE2b-256: 087aa56602969e36ba765860562d0fe087c42e82a135719fe753f8897ab2a151

File details

Details for the file iflow_mcp_mem0_mcp-0.1.1-py3-none-any.whl.

File metadata

  • Download URL: iflow_mcp_mem0_mcp-0.1.1-py3-none-any.whl
  • Size: 7.7 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: uv/0.9.10 (macOS)

File hashes

  • SHA256: d26f516a0a1d03dbe90fc84b4c4e1d6d56d8035623f3954c9b41850cc72a028e
  • MD5: d9dc3bdba0cea3c19c416baa7f520828
  • BLAKE2b-256: 6e05e268a80a3d66ad81670da988a9548e7b61bae8eae6dce4d873415c2d3705
