
Simple Snowflake MCP server

Simple Snowflake MCP server built to work behind a corporate proxy (because I couldn't get that working in a few minutes with the existing servers, but with my own server, yup). Still don't know if it's good or not, but it's good enough for now.

Tools

The server exposes the following MCP tools to interact with Snowflake:

  • execute-snowflake-sql: Executes a SQL query on Snowflake and returns the result (list of dictionaries)
  • list-snowflake-warehouses: Lists available Data Warehouses (DWH) on Snowflake
  • list-databases: Lists all accessible Snowflake databases
  • list-views: Lists all views in a database and schema
  • describe-view: Gives details of a view (columns, SQL)
  • query-view: Queries a view with an optional row limit (markdown result)
  • execute-query: Executes a SQL query; in read-only mode only SELECT, SHOW, DESCRIBE, EXPLAIN, and WITH statements are allowed (set read_only to false to run other statements); the result is returned in Markdown format
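
The read-only restriction on execute-query amounts to gating on the statement's leading keyword. A minimal sketch of such a check (the function name is hypothetical, not the server's actual code):

```python
# Keywords the tool description lists as allowed in read-only mode.
READ_ONLY_PREFIXES = ("SELECT", "SHOW", "DESCRIBE", "EXPLAIN", "WITH")

def is_read_only(sql: str) -> bool:
    """Return True if the statement starts with an allowed read-only keyword."""
    stripped = sql.lstrip()
    if not stripped:
        return False
    first_word = stripped.split(None, 1)[0].upper()
    return first_word in READ_ONLY_PREFIXES
```

A check like this would reject a `DROP TABLE` call unless read_only is explicitly set to false.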

Quickstart

Install

Claude Desktop

On macOS: ~/Library/Application\ Support/Claude/claude_desktop_config.json

On Windows: %APPDATA%/Claude/claude_desktop_config.json

Development/Unpublished Servers Configuration
"mcpServers": {
  "simple_snowflake_mcp": {
    "command": "uv",
    "args": [
      "--directory",
      ".", // Use current directory for GitHub
      "run",
      "simple_snowflake_mcp"
    ]
  }
}
Published Servers Configuration
"mcpServers": {
  "simple_snowflake_mcp": {
    "command": "uvx",
    "args": [
      "simple_snowflake_mcp"
    ]
  }
}

Development

Building and Publishing

To prepare the package for distribution:

  1. Sync dependencies and update lockfile:
uv sync
  2. Build package distributions:
uv build

This will create source and wheel distributions in the dist/ directory.

  3. Publish to PyPI:
uv publish

Note: You'll need to set PyPI credentials via environment variables or command flags:

  • Token: --token or UV_PUBLISH_TOKEN
  • Or username/password: --username/UV_PUBLISH_USERNAME and --password/UV_PUBLISH_PASSWORD

Debugging

Since MCP servers run over stdio, debugging can be challenging. For the best debugging experience, we strongly recommend using the MCP Inspector.

You can launch the MCP Inspector via npm with this command:

npx @modelcontextprotocol/inspector uv --directory . run simple-snowflake-mcp

Upon launching, the Inspector will display a URL that you can access in your browser to begin debugging.

New Feature: Snowflake SQL Execution

The server exposes an MCP tool execute-snowflake-sql to execute a SQL query on Snowflake and return the result.

Usage

Call the MCP tool execute-snowflake-sql with a sql argument containing the SQL query to execute. The result will be returned as a list of dictionaries (one per row).

Example:

{
  "name": "execute-snowflake-sql",
  "arguments": { "sql": "SELECT CURRENT_TIMESTAMP;" }
}

The result will be returned in the MCP response.
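
The list-of-dictionaries shape can be reproduced from any DB-API-style cursor by zipping column names with row tuples. A sketch of that conversion in plain Python (illustrating the result shape only, not the server's actual code):

```python
def rows_to_dicts(column_names, rows):
    """Pair each row tuple with the column names, producing one dict per row."""
    return [dict(zip(column_names, row)) for row in rows]

# The shape a SELECT CURRENT_TIMESTAMP result might take (timestamp value made up):
result = rows_to_dicts(["CURRENT_TIMESTAMP"], [("2024-01-01 00:00:00",)])
```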

Installation and configuration in VS Code

  1. Clone the project and install dependencies

    git clone <your-repo>
    cd simple_snowflake_mcp
    python -m venv .venv
    .venv/Scripts/activate  # Windows
    source .venv/bin/activate  # macOS/Linux
    pip install -r requirements.txt  # or `uv sync --dev --all-extras` if available
    
  2. Configure Snowflake access

    • Copy .env.example to .env (or create .env at the root) and fill in your credentials:
      SNOWFLAKE_USER=...
      SNOWFLAKE_PASSWORD=...
      SNOWFLAKE_ACCOUNT=...
      # Optional: SNOWFLAKE_WAREHOUSE=...   # warehouse name
      # Optional: SNOWFLAKE_DATABASE=...    # default database name
      # Optional: SNOWFLAKE_SCHEMA=...      # default schema name
      # Optional: MCP_READ_ONLY=true|false  # force read-only mode
      
  3. Configure VS Code for MCP debugging

    • The .vscode/mcp.json file is already present:
      {
        "servers": {
          "simple-snowflake-mcp": {
            "type": "stdio",
            "command": ".venv/Scripts/python.exe",
            "args": ["-m", "simple_snowflake_mcp"]
          }
        }
      }
      
    • Open the command palette (Ctrl+Shift+P), type MCP: Start Server and select simple-snowflake-mcp.
  4. Usage: call the supported MCP functions below from your MCP client.
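
The environment variables above map naturally onto a small settings loader. A sketch using only the standard library (the variable names come from .env above; the function name and the read-only default are assumptions, not the server's actual code):

```python
import os

def load_snowflake_settings(env=os.environ):
    """Read Snowflake connection settings from the environment.

    The first three variables are required; the rest fall back to None,
    and MCP_READ_ONLY defaults to true (an assumed default).
    """
    return {
        "user": env["SNOWFLAKE_USER"],
        "password": env["SNOWFLAKE_PASSWORD"],
        "account": env["SNOWFLAKE_ACCOUNT"],
        "warehouse": env.get("SNOWFLAKE_WAREHOUSE"),
        "database": env.get("SNOWFLAKE_DATABASE"),
        "schema": env.get("SNOWFLAKE_SCHEMA"),
        "read_only": env.get("MCP_READ_ONLY", "true").lower() == "true",
    }
```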

Supported MCP Functions

The supported tools are the seven listed in the Tools section above. For each tool, see the Usage section or the MCP documentation for the call format.
