
Database Tools MCP Server

An MCP (Model Context Protocol) server that provides database exploration and query tools for AI assistants. This server enables safe SQL querying and database schema exploration with built-in security features.

Features

  • Table List Explorer: List all tables and views in a database
  • Table Details Explorer: Get detailed schema information and sample data for specific tables
  • Query Engine: Execute read-only SQL queries with safety features and timeouts

Installation

No installation required! Use uvx to run directly:

uvx oriona-database-tools

Quick Start

  1. No installation needed - uvx runs the package directly

  2. Set your database URL:

    # PostgreSQL
    export DATABASE_URL="postgresql://user:password@localhost:5432/mydb"
    
    # SQL Server
    export DATABASE_URL="mssql+pyodbc://user:password@localhost:1433/mydb?driver=ODBC+Driver+18+for+SQL+Server"
    
    # MySQL
    export DATABASE_URL="mysql+pymysql://user:password@localhost:3306/mydb"
    
    # SQLite
    export DATABASE_URL="sqlite:///path/to/database.db"
    
  3. Add to Claude Desktop (see configuration below)
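
Before wiring the URL into Claude Desktop, it can help to sanity-check its components. A minimal stdlib sketch (the credentials and host below are placeholders, not real values):

```python
from urllib.parse import urlsplit

# placeholder URL; substitute your real DATABASE_URL
url = "postgresql://user:password@localhost:5432/mydb"
parts = urlsplit(url)

# the scheme tells SQLAlchemy which dialect/driver to load;
# the path (minus the leading slash) is the database name
print(parts.scheme, parts.hostname, parts.port, parts.path.lstrip("/"))
```

If the scheme, host, port, or database name print incorrectly, the URL will fail at connection time, so this is a cheap first check.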

Claude Desktop Configuration

Add to your Claude Desktop configuration file:

macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
Windows: %APPDATA%\Claude\claude_desktop_config.json

{
  "mcpServers": {
    "database-tools": {
      "command": "uvx",
      "args": ["oriona-database-tools"],
      "env": {
        "DATABASE_URL": "postgresql://user:password@localhost:5432/mydb",
        "TABLE_WHITELIST": "users,orders,products",
        "TABLE_BLACKLIST": "api_keys,secrets"
      }
    }
  }
}

Security Note: Always use a read-only database user for the MCP server to ensure data safety.

Available Tools

1. list_tables

List all tables and views in a database.

Parameters:

  • include_views (boolean, optional): Include database views in the list (default: true)

Example:

{
  "tool": "list_tables",
  "arguments": {
    "include_views": true
  }
}

2. explore_table

Get detailed information about a specific table including schema, foreign keys, and sample data.

Parameters:

  • table_name (string, required): The name of the table to analyze
  • sample_size (integer, optional): Number of sample rows to retrieve (0-100, default: 3)

Example:

{
  "tool": "explore_table",
  "arguments": {
    "table_name": "customers",
    "sample_size": 5
  }
}

3. query_database_readonly

Execute read-only SELECT queries with safety features.

Parameters:

  • query (string, required): The SQL query to execute (SELECT only)
  • timeout_seconds (integer, optional): Maximum query execution time in seconds (default: 30)
  • max_rows (integer, optional): Maximum number of rows to return (0 for unlimited, default: 100)

Example:

{
  "tool": "query_database_readonly",
  "arguments": {
    "query": "SELECT * FROM orders WHERE created_at > '2024-01-01' LIMIT 10",
    "timeout_seconds": 30,
    "max_rows": 100
  }
}

Supported Databases

  • PostgreSQL (recommended)
  • MySQL
  • Microsoft SQL Server
  • SQLite
  • BigQuery (Google Cloud)
  • Any SQLAlchemy-supported database

Security Features

  • Read-only queries: Only SELECT and WITH queries are allowed
  • Query timeout: Configurable timeout to prevent long-running queries
  • Row limits: Default limit of 100 rows per query (configurable)
  • Connection pooling: Efficient connection management with pool recycling
  • URI sanitization: Automatic conversion of legacy postgres:// to postgresql://
  • Table filtering: Whitelist/blacklist specific tables to control access

Environment Variables

Required:

  • DATABASE_URL: The database connection URL (e.g., postgresql://user:pass@localhost:5432/mydb)

Optional:

  • DATABASE_TOOLS_LOG_LEVEL: Set logging level (default: INFO)
  • DATABASE_TOOLS_MAX_CONNECTIONS: Maximum database connections per pool (default: 5)
  • TABLE_WHITELIST: Comma-separated list of allowed tables (e.g., users,orders,products)
  • TABLE_BLACKLIST: Comma-separated list of forbidden tables (e.g., secrets,sensitive_data)

BigQuery-specific (when using BigQuery):

  • BIGQUERY_PROJECT_ID: Google Cloud project ID (required for BigQuery)
  • BIGQUERY_CREDENTIALS_BASE64: Base64 encoded service account JSON (optional, falls back to default credentials)
  • BIGQUERY_DATASET: Default dataset name (optional, if not specified, queries all accessible datasets)
  • BIGQUERY_LOCATION: BigQuery location/region (default: US)

Microsoft SQL Server Setup

Prerequisites

  1. Install ODBC Driver: Download and install the Microsoft ODBC Driver for SQL Server for your platform.

  2. Claude Desktop Configuration:

    For local development, ODBC Driver 17 is recommended, as it is more flexible.

    {
      "mcpServers": {
        "database-tools": {
          "command": "uvx",
          "args": ["oriona-database-tools"],
          "env": {
            "DATABASE_URL": "mssql+pyodbc://{{uname}}:{{pw}}@localhost:{{port}}/{{db}}?driver=ODBC+Driver+17+for+SQL+Server&TrustServerCertificate=yes"
          }
        }
      }
    }
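
Note that the driver name in the connection URL contains spaces, which must be URL-encoded as + signs. A sketch of building the URL safely with the stdlib (credentials and host are placeholders):

```python
from urllib.parse import quote_plus

# placeholder credentials/host; substitute your own
driver = "ODBC Driver 17 for SQL Server"
url = (
    "mssql+pyodbc://user:password@localhost:1433/mydb"
    f"?driver={quote_plus(driver)}&TrustServerCertificate=yes"
)
print(url)
```

Using quote_plus avoids hand-writing the + signs and keeps the URL correct if the driver name changes (e.g. version 18).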
    

BigQuery Setup

Authentication

Option 1: Base64 Encoded Service Account (Recommended)

  1. Create a service account in Google Cloud Console
  2. Download the JSON key file
  3. Encode to base64:
    base64 -i /path/to/service-account.json
    
  4. Configure Claude Desktop:
    {
      "mcpServers": {
        "database-tools": {
          "command": "uvx",
          "args": ["oriona-database-tools"],
          "env": {
            "BIGQUERY_PROJECT_ID": "your-project-id",
            "BIGQUERY_CREDENTIALS_BASE64": "eyJ0eXBlIjoic2VydmljZV9hY2NvdW50Ii...",
            "BIGQUERY_DATASET": "your_dataset"
          }
        }
      }
    }
    

Option 2: Default Credentials (Fallback)

If you have the Google Cloud CLI installed and authenticated:

gcloud auth application-default login

Then configure without credentials:

{
  "mcpServers": {
    "database-tools": {
      "command": "uvx",
      "args": ["oriona-database-tools"],
      "env": {
        "BIGQUERY_PROJECT_ID": "your-project-id",
        "BIGQUERY_DATASET": "your_dataset"
      }
    }
  }
}
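
To sanity-check a BIGQUERY_CREDENTIALS_BASE64 value before pasting it into the config, you can round-trip it in Python. The key fields below are placeholders standing in for a real service-account JSON file:

```python
import base64
import json

# placeholder payload; a real service-account file has more fields
payload = json.dumps({"type": "service_account", "project_id": "your-project-id"})
encoded = base64.b64encode(payload.encode()).decode()

# decoding must yield valid JSON whose "type" is "service_account"
decoded = json.loads(base64.b64decode(encoded))
print(decoded["type"])
```

If the decode step raises or the type field is missing, the base64 string was corrupted (often by stray newlines from the shell).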

Dataset Configuration

  • Specific dataset: Set BIGQUERY_DATASET to limit access to one dataset
  • All datasets: Leave BIGQUERY_DATASET unset to access all datasets (tables will be prefixed with dataset.table)

Required Permissions

Your service account needs these BigQuery permissions:

  • bigquery.datasets.get
  • bigquery.tables.list
  • bigquery.tables.get
  • bigquery.tables.getData
  • bigquery.jobs.create

Table Filtering

You can control which tables are accessible through the MCP server using whitelist and blacklist configurations:

Whitelist Mode

When TABLE_WHITELIST is set, only the specified tables will be accessible:

{
  "env": {
    "DATABASE_URL": "postgresql://user:pass@localhost:5432/mydb",
    "TABLE_WHITELIST": "users,orders,products"
  }
}

In this example, only users, orders, and products tables can be accessed.

Blacklist Mode

When TABLE_BLACKLIST is set, the specified tables will be blocked:

{
  "env": {
    "DATABASE_URL": "postgresql://user:pass@localhost:5432/mydb",
    "TABLE_BLACKLIST": "api_keys,user_passwords,audit_logs"
  }
}

In this example, api_keys, user_passwords, and audit_logs tables cannot be accessed.

Filtering Behavior

  • Filtered tables won't appear in list_tables results
  • Attempts to access filtered tables with explore_table will return an access denied error
  • Table names are case-insensitive (e.g., Users and users are treated the same)
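
The behavior above can be sketched as a small predicate. This is illustrative only, assuming the comma-separated variables are parsed case-insensitively as documented; it is not the server's actual code:

```python
import os

def is_allowed(table: str) -> bool:
    # whitelist takes precedence when set; otherwise fall back to the blacklist
    whitelist = {t.strip().lower()
                 for t in os.environ.get("TABLE_WHITELIST", "").split(",") if t.strip()}
    blacklist = {t.strip().lower()
                 for t in os.environ.get("TABLE_BLACKLIST", "").split(",") if t.strip()}
    name = table.lower()
    if whitelist:
        return name in whitelist
    return name not in blacklist

os.environ["TABLE_WHITELIST"] = "users,orders,products"
print(is_allowed("Users"), is_allowed("api_keys"))
```

Note how "Users" passes despite the different case, matching the documented case-insensitive comparison.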

Error Handling

The server returns structured error responses:

{
  "error": "Error message",
  "error_type": "ExceptionType",
  "recommendation": "Suggested action"
}

Common errors:

  • Table/column not found: Check table names with list_tables
  • Query timeout: Reduce query complexity or increase timeout
  • Permission denied: Verify database credentials
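
A client consuming these responses can format them for display from the documented shape. A sketch (the error_type value shown is hypothetical, not a guaranteed exception name):

```python
def describe_error(response: dict) -> str:
    # keys follow the documented error shape: error, error_type, recommendation
    if "error" not in response:
        return "ok"
    hint = response.get("recommendation", "no recommendation")
    return f"{response.get('error_type', 'UnknownError')}: {response['error']} ({hint})"

print(describe_error({
    "error": "table 'ordes' not found",
    "error_type": "NoSuchTableError",  # hypothetical type for illustration
    "recommendation": "Check table names with list_tables",
}))
```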
