OceanBase MCP Server

A Model Context Protocol (MCP) server that enables secure interaction with OceanBase databases. This server allows AI assistants to list tables, read data, and execute SQL queries through a controlled interface, making database exploration and analysis safer and more structured.

Features

  • List available OceanBase tables as resources
  • Read table contents
  • Execute SQL queries with proper error handling
  • Secure database access through environment variables
  • Comprehensive logging

Tools

  • [✔️] Execute SQL queries
  • [✔️] Get current tenant
  • [✔️] Get all server nodes (sys tenant only)
  • [✔️] Get resource capacity (sys tenant only)
  • [✔️] Get ASH report
  • [✔️] Search OceanBase documentation on the official website. This tool is experimental because the website's API may change.
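As a sketch of what "proper error handling" means for the SQL tool, the pattern below wraps a query executor so the tool always returns a structured result instead of letting a driver exception escape to the client. All names here (`run_query`, `fake_execute`) are hypothetical stand-ins, not the server's actual API.

```python
def run_query(execute, sql):
    """Run `sql` via the supplied executor callable and return a
    structured result, so the caller always gets a well-formed reply."""
    try:
        return {"ok": True, "rows": execute(sql)}
    except Exception as exc:  # surface any driver error as data
        return {"ok": False, "error": str(exc)}

def fake_execute(sql):
    """Stand-in for a real database driver call."""
    if sql == "SHOW TABLES":
        return [("t1",), ("t2",)]
    raise ValueError("syntax error")

print(run_query(fake_execute, "SHOW TABLES"))
# {'ok': True, 'rows': [('t1',), ('t2',)]}
print(run_query(fake_execute, "SELCT 1"))
# {'ok': False, 'error': 'syntax error'}
```

Returning errors as data rather than raising keeps the MCP tool response well-formed even for malformed queries.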

Install from PyPI Repository

Install the Python package manager uv and create a virtual environment

curl -LsSf https://astral.sh/uv/install.sh | sh
uv venv
source .venv/bin/activate  # or `.venv\Scripts\activate` on Windows

If dependency packages cannot be downloaded via uv due to network issues, you can switch to the Alibaba Cloud mirror:

export UV_DEFAULT_INDEX="https://mirrors.aliyun.com/pypi/simple/"

Install OceanBase MCP Server

uv pip install oceanbase-mcp

Configuration

There are two ways to configure the OceanBase connection information:

  1. Set the following environment variables:

export OB_HOST=localhost     # Database host
export OB_PORT=2881          # Optional: defaults to 2881 if not specified
export OB_USER=your_username
export OB_PASSWORD=your_password
export OB_DATABASE=your_database

  2. Configure a .env file: create a .env file in the directory where the OceanBase MCP Server command is executed and fill in the following information:

OB_HOST=localhost            # Database host
OB_PORT=2881                 # Optional: defaults to 2881 if not specified
OB_USER=your_username
OB_PASSWORD=your_password
OB_DATABASE=your_database
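A sketch of how the two configuration sources can combine. This is illustrative only: `load_ob_config` is not part of the package, and the precedence shown (exported environment variables overriding .env values) is an assumption.

```python
import os

def load_ob_config(dotenv_values=None):
    """Merge connection settings: exported environment variables win,
    values parsed from a .env file act as fallbacks, then defaults."""
    dotenv_values = dotenv_values or {}
    def get(key, default=None):
        return os.environ.get(key, dotenv_values.get(key, default))
    return {
        "host": get("OB_HOST", "localhost"),
        "port": int(get("OB_PORT", "2881")),  # OB_PORT is optional
        "user": get("OB_USER"),
        "password": get("OB_PASSWORD"),
        "database": get("OB_DATABASE"),
    }

# Values as parsed from a .env file; OB_PORT falls back to 2881.
cfg = load_ob_config({"OB_HOST": "127.0.0.1", "OB_USER": "app"})
```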

Usage

stdio Mode

Add the following content to the configuration file that supports the MCP server client:

{
  "mcpServers": {
    "oceanbase": {
      "command": "uvx",
      "args": [
        "oceanbase-mcp"
      ],
      "env": {
        "OB_HOST": "localhost",
        "OB_PORT": "2881",
        "OB_USER": "your_username",
        "OB_PASSWORD": "your_password",
        "OB_DATABASE": "your_database"
      }
    }
  }
}

sse Mode

Within the mcp-oceanbase directory, execute the following command; the port can be customized as desired.

--transport: MCP server transport type, stdio or sse (default: stdio)
--host: host for SSE mode to bind to (default: 127.0.0.1, reachable only from the local machine; set it to 0.0.0.0 to allow remote clients to connect)
--port: port for SSE mode to listen on (default: 8000)

oceanbase_mcp_server --transport sse --port 8000

In general, the URL to configure for SSE mode is http://ip:port/sse (substitute the server's IP address and port).
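A trivial helper (hypothetical, not shipped with the package) that derives the client-side endpoint URL from the --host/--port values the server was started with:

```python
def sse_endpoint(host="127.0.0.1", port=8000):
    """Build the SSE endpoint URL a client should configure for a
    server started with the given --host/--port values."""
    # 0.0.0.0 is a bind address, not a destination: connect locally
    # via 127.0.0.1 (or via the machine's LAN address from outside).
    connect_host = "127.0.0.1" if host == "0.0.0.0" else host
    return f"http://{connect_host}:{port}/sse"

print(sse_endpoint())                   # http://127.0.0.1:8000/sse
print(sse_endpoint("10.0.0.5", 9000))   # http://10.0.0.5:9000/sse
```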

🧠 AI Memory System

Experimental Feature: Transform your AI assistant with persistent vector-based memory powered by OceanBase's advanced vector capabilities.

The memory system enables your AI to maintain continuous context across conversations, eliminating the need to repeat personal preferences and information. Four intelligent tools work together to create a seamless memory experience:

  • ob_memory_query - Semantically search and retrieve contextual memories
  • ob_memory_insert - Automatically capture and store important conversations
  • ob_memory_delete - Remove outdated or unwanted memories
  • ob_memory_update - Evolve memories with new information over time
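To picture how ob_memory_query retrieves "semantically similar" memories, here is a toy sketch of embedding-based search: each memory is stored as a vector, and a query returns the closest one by cosine similarity. In the real server the embedding model and OceanBase's vector index do this work; the three-component vectors below are hand-made stand-ins for embedding output.

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

# Toy "memory store": text mapped to stand-in embedding vectors.
memories = {
    "likes football and basketball": [0.9, 0.1, 0.0],
    "works in Shanghai using Python": [0.1, 0.9, 0.2],
}

def memory_query(query_vec, top_k=1):
    """Return the top_k most similar memories to the query vector."""
    ranked = sorted(memories, key=lambda m: cosine(query_vec, memories[m]),
                    reverse=True)
    return ranked[:top_k]

print(memory_query([0.8, 0.2, 0.0]))  # the sports memory ranks first
```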

🚀 Quick Setup

Memory tools are disabled by default to avoid the initial embedding model download (0.5–4 GiB). Installing the extra dependencies is necessary:

cd mcp-oceanbase/src/oceanbase_mcp_server
uv sync --extra memory

Enable intelligent memory with these environment variables:

ENABLE_MEMORY=1  # default 0 disabled, set 1 to enable
EMBEDDING_MODEL_NAME=BAAI/bge-small-en-v1.5  # default BAAI/bge-small-en-v1.5; set BAAI/bge-m3 or another model for better results
EMBEDDING_MODEL_PROVIDER=huggingface

📋 Prerequisites

Vector Support: Requires OceanBase v4.3.5.3+ (vector features enabled by default)

sudo docker run -p 2881:2881 --name obvector -e MODE=mini -d oceanbase/oceanbase-ce:4.3.5.3-103000092025080818

Legacy Versions: For older OceanBase versions, manually configure ob_vector_memory_limit_percentage.

💡 Usage Example

Experience the power of cross-session intelligent memory:

📅 Monday Conversation
User: "I love football and basketball, but I don't like swimming. I work in Shanghai using Python."
AI: "Got it! I've saved your preferences and work information!" 
    💾 [Automatically calls ob_memory_insert to save preference data]

📅 Wednesday Conversation  
User: "Recommend some sports I might be interested in"
AI: 🔍 [Automatically calls ob_memory_query searching "sports preferences"]
    "Based on your previous preferences, I recommend football and basketball activities! 
     Since you mentioned not liking swimming, here are some great land-based sports..."

📅 One Week Later
User: "Where do I work and what programming language do I use?"  
AI: 🔍 [Automatically calls ob_memory_query searching "work programming"]
    "You work in Shanghai and primarily use Python for development."

🎯 Memory System Benefits:

  • Cross-Session Continuity - No need to reintroduce yourself
  • Intelligent Semantic Search - Understands related concepts and context
  • Personalized Experience - AI truly "knows" your preferences
  • Automatic Capture - Important information saved without manual effort

Security Considerations

  • Use a database user with minimal required permissions
  • Consider implementing query whitelisting for production use
  • Monitor and log all database operations
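Query whitelisting is left to the deployment; one minimal sketch (an assumption for illustration, not a shipped feature of this server) is to admit only read-only statement verbs and reject multi-statement input:

```python
import re

# Allowed read-only statement verbs; adjust for your deployment.
ALLOWED = re.compile(r"^\s*(SELECT|SHOW|DESCRIBE|DESC|EXPLAIN)\b", re.IGNORECASE)

def is_allowed(sql: str) -> bool:
    """Return True only for single, read-only statements."""
    # Reject multi-statement input outright (any ';' that is not a
    # trailing terminator), then match the leading verb.
    if ";" in sql.rstrip().rstrip(";"):
        return False
    return bool(ALLOWED.match(sql))

print(is_allowed("SELECT * FROM t;"))        # True
print(is_allowed("DROP TABLE t"))            # False
print(is_allowed("SELECT 1; DROP TABLE t"))  # False
```

A pattern check like this is a coarse first line of defense; it complements, rather than replaces, a minimally privileged database user.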

Security Best Practices

This MCP server requires database access to function. For security:

  1. Create a dedicated OceanBase user with minimal permissions
  2. Never use root credentials or administrative accounts
  3. Restrict database access to only necessary operations
  4. Enable logging for audit purposes
  5. Review database access security regularly

See OceanBase Security Configuration Guide for detailed instructions on:

  • Creating a restricted OceanBase user
  • Setting appropriate permissions
  • Monitoring database access
  • Security best practices

⚠️ IMPORTANT: Always follow the principle of least privilege when configuring database access.

License

Apache License - see LICENSE file for details.

Download files

Download the file for your platform.

Source Distribution

david_oceanbase_mcp_2-0.0.1.tar.gz (22.8 kB)

Built Distribution

david_oceanbase_mcp_2-0.0.1-py3-none-any.whl (22.1 kB)

File details

Details for the file david_oceanbase_mcp_2-0.0.1.tar.gz.

File metadata

  • Download URL: david_oceanbase_mcp_2-0.0.1.tar.gz
  • Size: 22.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.12.3

File hashes

Hashes for david_oceanbase_mcp_2-0.0.1.tar.gz
Algorithm Hash digest
SHA256 04758be60f9976269c5db35ae8c60b2490983267756a4eef8a6e20ad129f46f1
MD5 254753dbd0bcccc87c34eb130928f6d2
BLAKE2b-256 cb14e30981427b95c3f22858c741ef728a9ae608b5cbf30ebc267e88d3d4a15e


File details

Details for the file david_oceanbase_mcp_2-0.0.1-py3-none-any.whl.

File metadata

File hashes

Hashes for david_oceanbase_mcp_2-0.0.1-py3-none-any.whl
Algorithm Hash digest
SHA256 f41bd2516f72f331a71adcadbd95a3482a8e9c799746b96878056c4b64c2a153
MD5 ae3b6e35aeb927c678903f3dfe09134d
BLAKE2b-256 5cffa1c3ce3b82a0d140d3091ea38c4432b541593da3911f4b08fc15788a636f

