Snowflake MCP Server NSP

A Model Context Protocol (MCP) server that connects AI assistants to Snowflake, enabling SQL queries, schema exploration, and data insights directly from your LLM client.

Highlights:

  • Multiple authentication methods: password, key-pair, external browser, OAuth 2.0 (client credentials & bearer token), TOML connection files
  • TOML multi-connection config — manage production, staging, and development environments in one file
  • Write-safety guard — write operations are disabled by default and must be explicitly enabled
  • Exclusion patterns — filter out databases, schemas, or tables from discovery
  • --exclude-json-results flag — reduces LLM context window usage
  • Selective tool exclusion via --exclude_tools
  • Prefetch mode — pre-load table schema as MCP resources
  • Docker support with hardened image (DHI, nonroot user, no shell in runtime)
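
The guard and filtering flags above combine on a single invocation. As a sketch (the profile name myconn and the excluded tool are placeholders; see the Configuration Reference for all flags):

```shell
# Fragment, not executed here: starting the server needs a real
# connections file and Snowflake credentials.
#   --allow_write          opts in to write_query / create_table
#   --exclude_tools        disables the named tools
#   --exclude-json-results trims embedded JSON from responses
uvx --python=3.13 --from mcp-snowflake-server-nsp mcp_snowflake_server \
  --connections-file ~/snowflake_connections.toml \
  --connection-name myconn \
  --allow_write \
  --exclude_tools append_insight \
  --exclude-json-results
```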

Quick Start

The fastest way to try it — using uvx with a TOML connection file:

# 1. Create a connections file
cat > ~/snowflake_connections.toml << 'EOF'
[myconn]
account = "your_account"
user = "your_user"
password = "your_password"
warehouse = "COMPUTE_WH"
database = "MY_DB"
schema = "PUBLIC"
role = "MYROLE"
EOF

# 2. Run the server
uvx --python=3.13 --from mcp-snowflake-server-nsp mcp_snowflake_server \
  --connections-file ~/snowflake_connections.toml \
  --connection-name myconn

Claude Code / Claude Desktop

Add to your MCP client config (claude_desktop_config.json for Claude Desktop, or a project-level .mcp.json for Claude Code) using snowflake_connections.toml:

"mcpServers": {
  "snowflake": {
    "command": "uvx",
    "args": [
      "--python=3.13",
      "--from", "mcp-snowflake-server-nsp",
      "mcp_snowflake_server",
      "--connections-file", "/absolute/path/to/snowflake_connections.toml",
      "--connection-name", "myconn"
    ]
  }
}

Visual Studio Code (VSCode)

Add to your MCP client config (e.g. .vscode/mcp.json) using a .env file (see Authentication):

"snowflake": {
  // Snowflake MCP server
  "type": "stdio",
  "command": "uvx",
  "args": [
    "--from", "mcp-snowflake-server-nsp",
    "--python=3.13",
    "mcp_snowflake_server"
  ],
  "envFile": "${workspaceFolder}/.env"
}

OpenCode

Add to your MCP client config (e.g. opencode.jsonc) with .env file (see Authentication):

"snowflake": {
  "type": "local",
  "command": [
    "uvx",
    "--from",
    "mcp-snowflake-server-nsp",
    "--python=3.13",
    "mcp_snowflake_server"
  ],
  "enabled": true,
  "timeout": 300000
}

Components

Resources

URI                           Description
memo://insights               A continuously updated memo aggregating data insights appended via append_insight.
context://table/{table_name}  (Prefetch mode only) Per-table schema summaries including columns and comments.

Tools

Query Tools

Tool          Description                                                          Requires
read_query    Execute SELECT queries. Input: query (string).
write_query   Execute INSERT, UPDATE, or DELETE queries. Input: query (string).    --allow_write
create_table  Execute CREATE TABLE statements. Input: query (string).              --allow_write

Schema Tools

Tool            Description                                                                Input
list_databases  List all databases in the Snowflake instance.
list_schemas    List all schemas within a database.                                        database (string)
list_tables     List all tables within a database and schema.                              database, schema (strings)
describe_table  Describe columns of a table (name, type, nullability, default, comment).   table_name as database.schema.table

Analysis Tools

Tool            Description                                           Input
append_insight  Add a data insight to the memo://insights resource.   insight (string)

Authentication

Password

Set credentials via environment variables or CLI flags (see Configuration Reference):

SNOWFLAKE_USER="user@example.com"
SNOWFLAKE_ACCOUNT="myaccount"
SNOWFLAKE_AUTHENTICATOR="snowflake"
SNOWFLAKE_PASSWORD="secret"
SNOWFLAKE_WAREHOUSE="COMPUTE_WH"
SNOWFLAKE_DATABASE="MY_DB"
SNOWFLAKE_SCHEMA="PUBLIC"
SNOWFLAKE_ROLE="MYROLE"

Key-Pair

SNOWFLAKE_USER="user@example.com"
SNOWFLAKE_ACCOUNT="myaccount"
SNOWFLAKE_AUTHENTICATOR="snowflake_jwt"
SNOWFLAKE_PRIVATE_KEY_FILE="/absolute/path/to/key.p8"
SNOWFLAKE_PRIVATE_KEY_FILE_PWD="passphrase"  # Optional — only if key is encrypted
SNOWFLAKE_WAREHOUSE="COMPUTE_WH"
SNOWFLAKE_DATABASE="MY_DB"
SNOWFLAKE_SCHEMA="PUBLIC"
SNOWFLAKE_ROLE="MYROLE"

Or via CLI: --private_key_file /path/to/key.p8 --private_key_file_pwd passphrase
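
If you do not yet have a key pair, the standard Snowflake setup can be sketched with openssl (file names are illustrative; afterwards register the public key in Snowflake with ALTER USER your_user SET RSA_PUBLIC_KEY='...'):

```shell
# Generate an unencrypted PKCS#8 private key. For an encrypted key, replace
# -nocrypt with e.g. -v2 aes256 and set SNOWFLAKE_PRIVATE_KEY_FILE_PWD.
openssl genrsa 2048 | openssl pkcs8 -topk8 -inform PEM -nocrypt -out rsa_key.p8

# Derive the matching public key to register with your Snowflake user.
openssl rsa -in rsa_key.p8 -pubout -out rsa_key.pub
```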

External Browser

SNOWFLAKE_AUTHENTICATOR="externalbrowser"

Or in a TOML connection entry: authenticator = "externalbrowser"

OAuth 2.0 Client Credentials

Use the OAuth 2.0 client credentials flow to authenticate with a client ID and secret (no user interaction required):

SNOWFLAKE_AUTHENTICATOR="oauth_client_credentials"
SNOWFLAKE_ACCOUNT="myaccount"
SNOWFLAKE_OAUTH_CLIENT_ID="your_client_id"
SNOWFLAKE_OAUTH_CLIENT_SECRET="your_client_secret"
SNOWFLAKE_OAUTH_TOKEN_REQUEST_URL="https://your-idp.example.com/oauth/token"
SNOWFLAKE_OAUTH_SCOPE="session:role:MY_ROLE"  # Optional
SNOWFLAKE_WAREHOUSE="COMPUTE_WH"
SNOWFLAKE_DATABASE="MY_DB"
SNOWFLAKE_SCHEMA="PUBLIC"
SNOWFLAKE_ROLE="MYROLE"

OAuth Bearer Token

Use a pre-fetched OAuth bearer token:

SNOWFLAKE_AUTHENTICATOR="oauth"
SNOWFLAKE_ACCOUNT="myaccount"
SNOWFLAKE_TOKEN="eyJhbGciOiJSUzI1NiJ9..."
SNOWFLAKE_WAREHOUSE="COMPUTE_WH"
SNOWFLAKE_DATABASE="MY_DB"
SNOWFLAKE_SCHEMA="PUBLIC"
SNOWFLAKE_ROLE="MYROLE"
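
How you obtain the token depends on your identity provider. One illustrative sketch using a client-credentials request (the endpoint and credentials are placeholders, and jq is assumed for JSON parsing; none of this is part of the server itself):

```shell
# Illustrative only: request a token from a hypothetical IdP endpoint
# and export it for the MCP server to pick up.
export SNOWFLAKE_TOKEN="$(curl -s -X POST 'https://your-idp.example.com/oauth/token' \
  -d grant_type=client_credentials \
  -d client_id=your_client_id \
  -d client_secret=your_client_secret | jq -r .access_token)"
```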

TOML Connection File (Recommended)

Manage multiple environments in a single file. See example_connections.toml for a full template.

[production]
account = "your_account"
user = "your_user"
password = "your_password"
authenticator = "snowflake"
warehouse = "COMPUTE_WH"
database = "PROD_DB"
schema = "PUBLIC"
role = "ACCOUNTADMIN"

[development]
account = "your_account"
user = "dev_user"
authenticator = "externalbrowser"
warehouse = "DEV_WH"
database = "DEV_DB"
schema = "PUBLIC"
role = "DEVELOPER"

[reporting]
account = "your_account"
user = "reporting_user"
authenticator = "snowflake_jwt"
private_key_file = "/path/to/private_key.pem"
private_key_file_pwd = "passphrase"  # Optional
warehouse = "REPORTING_WH"
database = "REPORTING_DB"
schema = "REPORTS"
role = "REPORTING_ROLE"

[analytics_oauth]
account = "your_account"
authenticator = "oauth_client_credentials"
oauth_client_id = "your_client_id"
oauth_client_secret = "your_client_secret"
oauth_token_request_url = "https://your-idp.example.com/oauth/token"
oauth_scope = "session:role:ANALYTICS_ROLE"  # Optional
warehouse = "ANALYTICS_WH"
database = "ANALYTICS_DB"
schema = "PUBLIC"
role = "ANALYTICS_ROLE"

Pass the file with --connections-file and select a profile with --connection-name. Both flags are required together.


Installation

The package is published on PyPI as mcp-snowflake-server-nsp.

Contributing or running from source? See CONTRIBUTING.md for local development setup, test commands, formatting, and building the Docker image from source.


Via uvx

TOML configuration (recommended)
"mcpServers": {
  "snowflake_production": {
    "command": "uvx",
    "args": [
      "--python=3.13",
      "--from", "mcp-snowflake-server-nsp",
      "mcp_snowflake_server",
      "--connections-file", "/path/to/snowflake_connections.toml",
      "--connection-name", "production"
      // Optional flags — see Configuration Reference
    ]
  },
  "snowflake_staging": {
    "command": "uvx",
    "args": [
      "--python=3.13",
      "--from", "mcp-snowflake-server-nsp",
      "mcp_snowflake_server",
      "--connections-file", "/path/to/snowflake_connections.toml",
      "--connection-name", "staging"
    ]
  }
}
Individual parameters
"mcpServers": {
  "snowflake": {
    "command": "uvx",
    "args": [
      "--python=3.13",
      "--from", "mcp-snowflake-server-nsp",
      "mcp_snowflake_server",
      "--account", "your_account",
      "--warehouse", "your_warehouse",
      "--user", "your_user",
      "--password", "your_password",
      "--role", "your_role",
      "--database", "your_database",
      "--schema", "your_schema"
      // Optional: "--private_key_file", "/absolute/path/key.p8"
      // Optional: "--private_key_file_pwd", "passphrase"
      // Optional flags — see Configuration Reference
    ]
  }
}

Via Docker Hub

The image is published on Docker Hub — no build step required:

docker pull nsphung/mcp-snowflake-server-nsp

Note: -i (--interactive) is required to keep stdin open for the MCP stdio transport. Do not use -d (detach).

Claude Desktop — claude_desktop_config.json

With .env file (see Authentication):

"mcpServers": {
  "snowflake": {
    "command": "docker",
    "args": [
      "run", "--rm", "-i",
      "--env-file", "/absolute/path/to/.env",
      "nsphung/mcp-snowflake-server-nsp"
    ]
  }
}

With TOML connections file:

"mcpServers": {
  "snowflake": {
    "command": "docker",
    "args": [
      "run", "--rm", "-i",
      "-v", "/path/to/snowflake_connections.toml:/app/snowflake_connections.toml:ro",
      "nsphung/mcp-snowflake-server-nsp",
      "--connections-file", "/app/snowflake_connections.toml",
      "--connection-name", "production"
    ]
  }
}
VS Code — .vscode/mcp.json

With .env file:

"snowflake": {
  "type": "stdio",
  "command": "docker",
  "args": [
    "run", "--rm", "-i",
    "nsphung/mcp-snowflake-server-nsp"
  ],
  "envFile": "${workspaceFolder}/.env"
}

With TOML connections file:

"snowflake": {
  "type": "stdio",
  "command": "docker",
  "args": [
    "run", "--rm", "-i",
    "-v", "/path/to/snowflake_connections.toml:/app/snowflake_connections.toml:ro",
    "nsphung/mcp-snowflake-server-nsp",
    "--connections-file", "/app/snowflake_connections.toml",
    "--connection-name", "production"
  ]
}
OpenCode — opencode.jsonc
"snowflake": {
  "type": "local",
  "command": [
    "docker", "run", "--rm", "-i",
    "--env-file", "/absolute/path/to/.env",
    "nsphung/mcp-snowflake-server-nsp"
  ],
  "enabled": true,
  "timeout": 300000
}

Configuration Reference

All connection parameters can also be set as environment variables (SNOWFLAKE_<PARAM_UPPER>).

Flag                        Env var                          Default     Description
--account                   SNOWFLAKE_ACCOUNT                            Snowflake account identifier
--user                      SNOWFLAKE_USER                               Snowflake username
--password                  SNOWFLAKE_PASSWORD                           Password (not required for key-pair / SSO)
--warehouse                 SNOWFLAKE_WAREHOUSE                          Virtual warehouse to use
--database                  SNOWFLAKE_DATABASE               (required)  Default database
--schema                    SNOWFLAKE_SCHEMA                 (required)  Default schema
--role                      SNOWFLAKE_ROLE                               Role to assume
--private_key_file          SNOWFLAKE_PRIVATE_KEY_FILE                   Absolute path to .p8 private key file
--private_key_file_pwd      SNOWFLAKE_PRIVATE_KEY_FILE_PWD               Passphrase for encrypted private key
--connections-file                                                       Path to TOML connections file
--connection-name                                                        Connection profile name in TOML file (required with --connections-file)
--allow_write                                                false       Enable write_query and create_table tools
--prefetch / --no-prefetch                                   false       Pre-load table schema as context://table/* resources (disables list_tables / describe_table)
--exclude_tools                                              []          Space-separated list of tool names to disable
--exclude-json-results                                       false       Omit embedded JSON resources from responses (reduces context window usage)
--log_dir                                                                Directory for log file output
--log_level                                                  INFO        Log verbosity: DEBUG, INFO, WARNING, ERROR, CRITICAL
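
For example, these two forms of setting the default database are interchangeable (MY_DB is a placeholder):

```shell
# Equivalent to passing --database MY_DB on the command line,
# following the SNOWFLAKE_<PARAM_UPPER> naming convention.
export SNOWFLAKE_DATABASE="MY_DB"
echo "database: $SNOWFLAKE_DATABASE"   # prints "database: MY_DB"
```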

Exclusion Patterns

Edit runtime_config.json to exclude databases, schemas, or tables from all discovery tools. Patterns are matched case-insensitively as substrings.

{
  "exclude_patterns": {
    "databases": ["temp"],
    "schemas": ["temp", "information_schema"],
    "tables": ["temp"]
  }
}

The server loads this file automatically at startup from the working directory.
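
As an illustration of the matching rule (grep -i below only stands in for the server's case-insensitive substring comparison; it is not the server's implementation):

```shell
# A pattern like "temp" excludes any name containing it, in any case.
for name in TEMP_DB MyTempSchema SALES; do
  if echo "$name" | grep -qi 'temp'; then
    echo "excluded: $name"
  else
    echo "kept: $name"
  fi
done
# prints:
#   excluded: TEMP_DB
#   excluded: MyTempSchema
#   kept: SALES
```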


License

This project is licensed under the MIT License. See the LICENSE file for the full text.


Fork and Attribution

This repository is a fork of isaacwasserman/mcp-snowflake-server.


  • Upstream authors and contributors retain copyright for their contributions.
  • Fork-specific changes are maintained by nsphung.
  • A summary of notable modifications is tracked in NOTICE.
