
Project description

mem0-open-mcp

Open-source MCP server for mem0 - local LLMs, self-hosted, Docker-free.

Created because the official mem0-mcp configuration wasn't working properly for my setup.

Features

  • Local LLMs: Ollama (recommended), LMStudio*, or any OpenAI-compatible API
  • Self-hosted: Your data stays on your infrastructure
  • Docker-free: Simple pip install + CLI
  • Flexible: YAML config with environment variable support
  • Multiple Vector Stores: Qdrant, Chroma, Pinecone, and more

*LMStudio requires JSON-mode-compatible models

Quick Start

Installation

pip install mem0-open-mcp

Or install from source:

git clone https://github.com/wonseoko/mem0-open-mcp.git
cd mem0-open-mcp
pip install -e .

Usage

# Create default config
mem0-open-mcp init

# Interactive configuration wizard
mem0-open-mcp configure

# Test configuration (recommended for initial setup)
mem0-open-mcp test

# Start the server
mem0-open-mcp serve

# With options
mem0-open-mcp serve --port 8765 --user-id alice

The test command verifies your configuration without starting the server:

  • Checks Vector Store, LLM, and Embedder connections
  • Performs actual memory add/search operations
  • Cleans up test data automatically

Configuration

Create mem0-open-mcp.yaml:

server:
  host: "0.0.0.0"
  port: 8765
  user_id: "default"

llm:
  provider: "ollama"
  config:
    model: "llama3.2"
    base_url: "http://localhost:11434"

embedder:
  provider: "ollama"
  config:
    model: "nomic-embed-text"
    base_url: "http://localhost:11434"
    embedding_dims: 768

vector_store:
  provider: "qdrant"
  config:
    collection_name: "mem0_memories"
    host: "localhost"
    port: 6333
    embedding_model_dims: 768
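
Note that embedder.embedding_dims and vector_store.embedding_model_dims must agree (768 here, matching nomic-embed-text). Config values can also pull from environment variables, per the feature list above; the exact substitution syntax isn't shown in this README, so the snippet below assumes the common ${VAR} convention and should be checked against the project docs:

llm:
  provider: "ollama"
  config:
    model: "llama3.2"
    base_url: "${OLLAMA_BASE_URL}"  # assumed ${VAR} syntax, resolved from the environment at load time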

With LMStudio

⚠️ Note: LMStudio requires a model that supports response_format: json_object. mem0 uses structured JSON output for memory extraction. If you get response_format errors, use Ollama instead or select a model with JSON mode support in LMStudio.

llm:
  provider: "openai"
  config:
    model: "your-model-name"
    base_url: "http://localhost:1234/v1"

embedder:
  provider: "openai"
  config:
    model: "your-embedding-model"
    base_url: "http://localhost:1234/v1"
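
LMStudio is reached through mem0's OpenAI-compatible provider, so the underlying OpenAI client may demand an API key even though local servers generally ignore its value. If the server rejects a missing key, a placeholder as sketched below is a common workaround (the api_key field follows mem0's OpenAI provider config; the value itself is arbitrary):

llm:
  provider: "openai"
  config:
    model: "your-model-name"
    base_url: "http://localhost:1234/v1"
    api_key: "lm-studio"  # placeholder; most local OpenAI-compatible servers don't validate it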

MCP Integration

Connect your MCP client to:

http://localhost:8765/mcp/<client-name>/sse/<user-id>

Claude Desktop

{
  "mcpServers": {
    "mem0": {
      "url": "http://localhost:8765/mcp/claude/sse/default"
    }
  }
}

Available MCP Tools

  • add_memories: Store new memories from text
  • search_memory: Search memories by query
  • list_memories: List all user memories
  • get_memory: Get a specific memory by ID
  • delete_memories: Delete memories by IDs
  • delete_all_memories: Delete all user memories
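
These tools can also be called programmatically. Below is a minimal sketch using the official MCP Python SDK (pip install mcp) against the SSE endpoint pattern shown above; the tool names match the list, but the argument shapes ("text", "query") are assumptions, so verify them against the schemas returned by list_tools:

import asyncio

from mcp import ClientSession
from mcp.client.sse import sse_client

URL = "http://localhost:8765/mcp/my-client/sse/default"

async def main():
    async with sse_client(URL) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Discover the tools listed above, including their input schemas.
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])
            # Store a memory, then search for it (argument names assumed).
            await session.call_tool("add_memories", {"text": "Alice prefers dark mode"})
            result = await session.call_tool("search_memory", {"query": "dark mode"})
            print(result.content)

asyncio.run(main())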

API Endpoints

  • GET /health: Health check
  • GET /api/v1/status: Server status
  • GET/PUT /api/v1/config: Configuration
  • GET/POST/DELETE /api/v1/memories: Memory operations
  • POST /api/v1/memories/search: Search memories
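
The REST endpoints can also be scripted directly, without an MCP client. A minimal sketch using Python's requests library; the search body shown here is an assumption, as the exact request/response schema isn't documented above:

import requests

BASE = "http://localhost:8765"

# Health check: confirms the server is reachable.
health = requests.get(f"{BASE}/health")
print(health.status_code, health.text)

# Search memories; the JSON field names ("query", "user_id") are assumed.
resp = requests.post(
    f"{BASE}/api/v1/memories/search",
    json={"query": "dark mode", "user_id": "default"},
)
resp.raise_for_status()
print(resp.json())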

Requirements

  • Python 3.10+
  • Vector store (Qdrant recommended)
  • LLM server (Ollama, LMStudio, etc.)

Graph Store (Experimental)

The graph store enables knowledge-graph capabilities by extracting relationships between entities.

Configuration

graph_store:
  provider: "neo4j"
  config:
    url: "bolt://localhost:7687"
    username: "neo4j"
    password: "your-password"

Installation

pip install mem0-open-mcp[neo4j]
# or
pip install mem0-open-mcp[kuzu]

Limitations

⚠️ Important: Graph store requires LLMs with proper tool calling support.

  • OpenAI models: Full support (recommended for graph store)
  • Ollama models: Limited support - most models (llama3.2, llama3.1) do not follow tool schemas accurately, resulting in empty graph relations

If you need graph capabilities with local LLMs, consider using the graph_store.llm setting to specify a different LLM provider for graph operations only.

# Example: Use OpenAI for graph, Ollama for everything else
llm:
  provider: "ollama"
  config:
    model: "llama3.2"

graph_store:
  provider: "neo4j"
  config:
    url: "bolt://localhost:7687"
    username: "neo4j"
    password: "password"
  llm:
    provider: "openai"
    config:
      model: "gpt-4o-mini"

License

Apache 2.0

Download files

Download the file for your platform.

Source Distribution

mem0_open_mcp-0.2.0.tar.gz (270.5 kB)

Built Distribution

mem0_open_mcp-0.2.0-py3-none-any.whl (27.5 kB)

File details

Details for the file mem0_open_mcp-0.2.0.tar.gz.

File metadata

  • Download URL: mem0_open_mcp-0.2.0.tar.gz
  • Size: 270.5 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for mem0_open_mcp-0.2.0.tar.gz:

  • SHA256: 62aa8d8bb78f678c605088960c95349a414986c6892007b4f44fedbf007d03ef
  • MD5: a398ed05682796151a55432539656327
  • BLAKE2b-256: 80d24019108f2875df8e84ca237a21c5e7add5c3a38fe5117f838f14b42020e8

Provenance

The following attestation bundles were made for mem0_open_mcp-0.2.0.tar.gz:

Publisher: publish.yml on wonseoko/mem0-open-mcp

File details

Details for the file mem0_open_mcp-0.2.0-py3-none-any.whl.

File metadata

  • Download URL: mem0_open_mcp-0.2.0-py3-none-any.whl
  • Size: 27.5 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for mem0_open_mcp-0.2.0-py3-none-any.whl:

  • SHA256: cc65c7b1db37eb979d6db2f78235ec945ecf47a6ce881b661b1bdbe38b1a511e
  • MD5: 6bf44ff82949c0aed0e50dbd63f6712c
  • BLAKE2b-256: 11a6ea3bf38fd95e139612713196fcb230956e0ef51ff0d988ee5f1ddc99f92d

Provenance

The following attestation bundles were made for mem0_open_mcp-0.2.0-py3-none-any.whl:

Publisher: publish.yml on wonseoko/mem0-open-mcp
