
OceanBase MCP Server

A Model Context Protocol (MCP) server that enables secure interaction with OceanBase databases. This server allows AI assistants to list tables, read data, and execute SQL queries through a controlled interface, making database exploration and analysis safer and more structured.

Features

  • List available OceanBase tables as resources
  • Read table contents
  • Execute SQL queries with proper error handling
  • Secure database access through environment variables
  • Comprehensive logging

Tools

  • [✔️] Execute SQL queries
  • [✔️] Get current tenant
  • [✔️] Get all server nodes (sys tenant only)
  • [✔️] Get resource capacity (sys tenant only)
  • [✔️] Get ASH report
  • [✔️] Search OceanBase documentation on the official website. This tool is experimental because the website's API may change.

Install from PyPI Repository

Install the Python package manager uv and create a virtual environment:

curl -LsSf https://astral.sh/uv/install.sh | sh
uv venv
source .venv/bin/activate  # or `.venv\Scripts\activate` on Windows

If the dependency packages cannot be downloaded via uv due to network issues, you can switch to the Alibaba Cloud mirror source:

export UV_DEFAULT_INDEX="https://mirrors.aliyun.com/pypi/simple/"

Install OceanBase MCP Server

uv pip install oceanbase-mcp

Configuration

There are two ways to configure the OceanBase connection information:

  1. Set the following environment variables:

export OB_HOST=localhost     # Database host
export OB_PORT=2881          # Optional: database port (defaults to 2881 if not specified)
export OB_USER=your_username
export OB_PASSWORD=your_password
export OB_DATABASE=your_database

  2. Use a .env file. Create an .env file in the directory where the OceanBase MCP Server command is executed and fill in the following:

OB_HOST=localhost            # Database host
OB_PORT=2881                 # Optional: database port (defaults to 2881 if not specified)
OB_USER=your_username
OB_PASSWORD=your_password
OB_DATABASE=your_database
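Either way, the server ends up with the same five settings, with the port falling back to 2881 when unset. A minimal Python sketch of that resolution logic (hypothetical; the actual server's config loading may differ):

```python
import os

def load_ob_config() -> dict:
    """Assemble OceanBase connection settings from the environment,
    applying the documented default host and port. This is a sketch
    of the documented behavior, not the server's actual code."""
    return {
        "host": os.environ.get("OB_HOST", "localhost"),
        "port": int(os.environ.get("OB_PORT", "2881")),  # default port 2881
        "user": os.environ["OB_USER"],          # required, no default
        "password": os.environ["OB_PASSWORD"],  # required, no default
        "database": os.environ["OB_DATABASE"],  # required, no default
    }
```

Note that only the host and port have defaults; the user, password, and database must always be supplied.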

Usage

stdio Mode

Add the following content to the configuration file of an MCP-compatible client:

{
  "mcpServers": {
    "oceanbase": {
      "command": "uvx",
      "args": [
        "oceanbase-mcp"
      ],
      "env": {
        "OB_HOST": "localhost",
        "OB_PORT": "2881",
        "OB_USER": "your_username",
        "OB_PASSWORD": "your_password",
        "OB_DATABASE": "your_database"
      }
    }
  }
}
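Conceptually, the client reads this entry, spawns the given command as a subprocess with the OB_* variables in its environment, and speaks MCP over the subprocess's stdin/stdout. A small Python sketch of that first step (the launch itself is omitted):

```python
import json
import os

# The same client entry as above, as a JSON string (credentials are the
# placeholders from the example).
CLIENT_CONFIG = '''
{
  "mcpServers": {
    "oceanbase": {
      "command": "uvx",
      "args": ["oceanbase-mcp"],
      "env": {"OB_HOST": "localhost", "OB_PORT": "2881"}
    }
  }
}
'''

server = json.loads(CLIENT_CONFIG)["mcpServers"]["oceanbase"]
cmd = [server["command"], *server["args"]]  # command line to spawn
env = {**os.environ, **server["env"]}       # inherited env plus OB_* settings
# An MCP client would now spawn `cmd` with `env` and exchange JSON-RPC
# messages over the subprocess's stdin/stdout.
```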

sse Mode

Within the mcp-oceanbase directory, execute the following command; the port can be customized as desired.

  • '--transport': MCP server transport type, stdio or sse (default: stdio)
  • '--host': host to bind to in SSE mode (default: 127.0.0.1, which only accepts connections from the local machine; set it to 0.0.0.0 to allow remote clients)
  • '--port': SSE port to listen on (default: 8000)

oceanbase_mcp_server --transport sse --port 8000

In SSE mode the endpoint URL is generally http://<host>:<port>/sse
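The flag handling above can be sketched with argparse (a hypothetical mirror of the documented options; the real oceanbase_mcp_server parser may differ):

```python
import argparse

def build_parser() -> argparse.ArgumentParser:
    # Mirrors the documented CLI flags and their defaults.
    parser = argparse.ArgumentParser(prog="oceanbase_mcp_server")
    parser.add_argument("--transport", choices=["stdio", "sse"], default="stdio",
                        help="MCP server transport type")
    parser.add_argument("--host", default="127.0.0.1",
                        help="SSE host to bind; 0.0.0.0 allows remote clients")
    parser.add_argument("--port", type=int, default=8000,
                        help="SSE port to listen on")
    return parser

args = build_parser().parse_args(["--transport", "sse", "--port", "8000"])
sse_url = f"http://{args.host}:{args.port}/sse"  # endpoint clients connect to
```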

🧠 AI Memory System

Experimental Feature: Transform your AI assistant with persistent vector-based memory powered by OceanBase's advanced vector capabilities.

The memory system enables your AI to maintain continuous context across conversations, eliminating the need to repeat personal preferences and information. Four intelligent tools work together to create a seamless memory experience:

  • ob_memory_query - Semantically search and retrieve contextual memories
  • ob_memory_insert - Automatically capture and store important conversations
  • ob_memory_delete - Remove outdated or unwanted memories
  • ob_memory_update - Evolve memories with new information over time
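The four operations can be illustrated with a toy in-process stand-in. The real tools store embeddings in OceanBase and rank by vector similarity; this sketch substitutes bag-of-words cosine similarity purely to show the insert/query/update/delete lifecycle:

```python
import math
from collections import Counter

class ToyMemory:
    """Toy stand-in for the ob_memory_* tools. Stores plain texts and
    ranks them by bag-of-words cosine similarity; the real system uses
    an embedding model plus OceanBase vector search."""

    def __init__(self):
        self._store: dict[int, str] = {}
        self._next_id = 0

    def insert(self, text: str) -> int:          # cf. ob_memory_insert
        mid = self._next_id
        self._store[mid] = text
        self._next_id += 1
        return mid

    def update(self, mid: int, text: str) -> None:  # cf. ob_memory_update
        self._store[mid] = text

    def delete(self, mid: int) -> None:          # cf. ob_memory_delete
        del self._store[mid]

    def query(self, text: str, top_k: int = 1) -> list[str]:  # cf. ob_memory_query
        q = Counter(text.lower().split())
        def cosine(doc: str) -> float:
            d = Counter(doc.lower().split())
            num = sum(q[w] * d[w] for w in q)
            den = (math.sqrt(sum(v * v for v in q.values()))
                   * math.sqrt(sum(v * v for v in d.values())))
            return num / den if den else 0.0
        ranked = sorted(self._store.values(), key=cosine, reverse=True)
        return ranked[:top_k]
```

A real embedding model would also match related concepts ("sports" to "football"), which word overlap cannot do; that semantic matching is exactly what the vector backend adds.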

🚀 Quick Setup

Memory tools are disabled by default to avoid the initial embedding model download (0.5~4 GiB). The extra dependencies must be installed first:

cd mcp-oceanbase/src/oceanbase_mcp_server
uv sync --extra memory

Enable intelligent memory with these environment variables:

ENABLE_MEMORY=1  # default 0 (disabled); set to 1 to enable
EMBEDDING_MODEL_NAME=BAAI/bge-small-en-v1.5  # default; set BAAI/bge-m3 or another model for better results
EMBEDDING_MODEL_PROVIDER=huggingface
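The gate is a simple environment check; a sketch of the documented default-off behavior (hypothetical, not the server's actual code):

```python
import os

def memory_enabled() -> bool:
    # ENABLE_MEMORY defaults to "0" (disabled); per the docs, only the
    # value "1" turns the memory tools on.
    return os.environ.get("ENABLE_MEMORY", "0") == "1"
```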

📋 Prerequisites

Vector Support: Requires OceanBase v4.3.5.3+ (vector features enabled by default)

sudo docker run -p 2881:2881 --name obvector -e MODE=mini -d oceanbase/oceanbase-ce:4.3.5.3-103000092025080818

Legacy Versions: For older OceanBase versions, manually configure ob_vector_memory_limit_percentage.

💡 Usage Example

Experience the power of cross-session intelligent memory:

📅 Monday Conversation
User: "I love football and basketball, but I don't like swimming. I work in Shanghai using Python."
AI: "Got it! I've saved your preferences and work information!" 
    💾 [Automatically calls ob_memory_insert to save preference data]

📅 Wednesday Conversation  
User: "Recommend some sports I might be interested in"
AI: 🔍 [Automatically calls ob_memory_query searching "sports preferences"]
    "Based on your previous preferences, I recommend football and basketball activities! 
     Since you mentioned not liking swimming, here are some great land-based sports..."

📅 One Week Later
User: "Where do I work and what programming language do I use?"  
AI: 🔍 [Automatically calls ob_memory_query searching "work programming"]
    "You work in Shanghai and primarily use Python for development."

🎯 Memory System Benefits:

  • Cross-Session Continuity - No need to reintroduce yourself
  • Intelligent Semantic Search - Understands related concepts and context
  • Personalized Experience - AI truly "knows" your preferences
  • Automatic Capture - Important information saved without manual effort

Security Considerations

  • Use a database user with minimal required permissions
  • Consider implementing query whitelisting for production use
  • Monitor and log all database operations

Security Best Practices

This MCP server requires database access to function. For security:

  1. Create a dedicated OceanBase user with minimal permissions
  2. Never use root credentials or administrative accounts
  3. Restrict database access to only necessary operations
  4. Enable logging for audit purposes
  5. Conduct regular security reviews of database access

See OceanBase Security Configuration Guide for detailed instructions on:

  • Creating a restricted OceanBase user
  • Setting appropriate permissions
  • Monitoring database access
  • Security best practices

⚠️ IMPORTANT: Always follow the principle of least privilege when configuring database access.

License

Apache License - see LICENSE file for details.

