MCP server for code editors, connects to Google BigQuery

Nuuly BigQuery MCP Server

This MCP server provides AI code editors with access to Google BigQuery data through the Model Context Protocol (MCP). It allows AI assistants to understand BigQuery datasets and tables, and to run queries against them. This implementation acts as a client that forwards requests to a remote BigQuery MCP server running on Google Cloud Run.

Features

  • List available BigQuery datasets
  • Get detailed schema information for tables in a dataset
  • Run SQL queries against BigQuery datasets
  • Remote execution via Cloud Run service

Prerequisites

  • Python 3.8+

Installation

From PyPI (Recommended)

Install the package directly from PyPI:

pip install nuuly-bigquery-mcp-server

From Source (Not Recommended)

  1. Clone the repository:

    git clone https://github.com/urbn/r15-mcp.git
    cd r15-mcp/mcp_servers/bigquery-mcp
    
  2. Install in development mode:

    pip install -e .
    

Environment Variables

The server reads the following environment variables:

  • BQ_API_KEY (required): API key for authenticating with the remote BigQuery MCP server (see Confluence)
  • BIGQUERY_MCP_SERVER_URL (optional): URL of the remote BigQuery MCP server (use https://bigquery-mcp-toolbox-oe7jbzhmjq-uk.a.run.app/mcp/invoke)
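The required/optional split above can be sketched as a small startup check. This is a minimal illustration, not code from the package: `load_config` and the default URL fallback are hypothetical.

```python
import os

# Fallback endpoint if BIGQUERY_MCP_SERVER_URL is unset (an assumption;
# the package's actual default behavior may differ).
DEFAULT_SERVER_URL = "https://bigquery-mcp-toolbox-oe7jbzhmjq-uk.a.run.app/mcp/invoke"

def load_config(env=None):
    """Read the server's configuration from environment variables."""
    if env is None:
        env = os.environ
    api_key = env.get("BQ_API_KEY")
    if not api_key:
        # BQ_API_KEY is required; fail fast with a clear message.
        raise RuntimeError("BQ_API_KEY environment variable is required")
    url = env.get("BIGQUERY_MCP_SERVER_URL", DEFAULT_SERVER_URL)
    return {"api_key": api_key, "server_url": url}
```

Failing fast on a missing `BQ_API_KEY` surfaces the configuration error at startup rather than on the first forwarded tool call.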

Configure your MCP Server

Add this MCP server to your LLM client's MCP configuration file. Because installing the package puts the `nuuly-bigquery-mcp` entry point on your PATH, the configuration stays simple:

Example MCP configuration for Claude Desktop (in claude_desktop_config.json, entries go under the "mcpServers" key):

{
  "mcpServers": {
    "nuuly-bigquery-mcp": {
      "command": "nuuly-bigquery-mcp",
      "env": {
        "BQ_API_KEY": "your-api-key-here",
        "BIGQUERY_MCP_SERVER_URL": "https://bigquery-mcp-toolbox-oe7jbzhmjq-uk.a.run.app/mcp/invoke"
      }
    }
  }
}

Note: Replace your-api-key-here with your actual BigQuery MCP API key.
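To sanity-check the entry, you can load the configuration file and confirm the server is registered. A sketch under stated assumptions: `find_server` is a hypothetical helper, the path shown is the macOS default for Claude Desktop, and the lookup assumes entries live under a "mcpServers" key.

```python
import json
import os

def find_server(config, name):
    """Return the named server entry from an MCP config dict, or None."""
    # Claude Desktop nests server entries under the "mcpServers" key.
    return config.get("mcpServers", {}).get(name)

# Example: load the macOS config and look up the entry configured above.
config_path = os.path.expanduser(
    "~/Library/Application Support/Claude/claude_desktop_config.json"
)
if os.path.exists(config_path):
    with open(config_path) as f:
        entry = find_server(json.load(f), "nuuly-bigquery-mcp")
    print("registered" if entry else "missing")
```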

Available Tools

The server provides the following tools, which are forwarded to the remote BigQuery MCP server:

1. list_databases

Lists all available BigQuery datasets.

list_databases()

2. get_schema

Gets the schema of all tables in a specified dataset.

get_schema(database="your_dataset_name")

3. run_query

Runs a SQL query against a BigQuery dataset.

run_query(database="your_dataset_name", sql="SELECT * FROM your_table LIMIT 10")
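Since this package forwards each tool call to the Cloud Run endpoint, every call above ultimately becomes an HTTP request. The sketch below shows what such a forwarded call might look like; the JSON-RPC "tools/call" payload shape and the bearer-token Authorization header are assumptions about the remote service, not documented behavior of this package.

```python
import json
import os
import urllib.request

def build_tool_request(url, api_key, tool, arguments):
    """Build an HTTP request that invokes a remote MCP tool.

    The "tools/call" JSON-RPC shape and bearer-token auth are
    assumptions about the remote service's wire format.
    """
    payload = {
        "jsonrpc": "2.0",
        "id": 1,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    }
    return urllib.request.Request(
        url,
        data=json.dumps(payload).encode(),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
    )

# Example: prepare a forwarded run_query call (built, not sent).
req = build_tool_request(
    os.environ.get("BIGQUERY_MCP_SERVER_URL", "https://example.test/mcp/invoke"),
    os.environ.get("BQ_API_KEY", "your-api-key-here"),
    "run_query",
    {"database": "analytics", "sql": "SELECT 1"},
)
```

Sending the request with `urllib.request.urlopen(req)` would then return the remote tool's response for the local server to relay back to the editor.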

Example Usage

Here's an example of how to use the BigQuery MCP server with Claude:

I need to analyze data in BigQuery. Can you help me understand what datasets are available?

Claude will use the list_databases tool to show available datasets.

Now I want to see the schema of the 'analytics' dataset.

Claude will use the get_schema tool to show the tables and their schemas in the 'analytics' dataset.

Run a query to get the top 5 products by sales from the sales_data table.

Claude will use the run_query tool to execute the SQL query and display the results.

License

Copyright © 2025 URBN Inc.

Download files

Source distribution:

    nuuly_bigquery_mcp_server-1.0.1.tar.gz (4.4 kB)

    Algorithm    Hash digest
    SHA256       00d25c55d864910eff01597653404bb3a2a08dca2eb4e7f10c22982dce7f98f6
    MD5          7429d0adf200b248fb9f799e2400f29b
    BLAKE2b-256  1c5b97e9af821b72a94ff5d4cfa3647ef6abecbaca9872b35c09110f4a77098d

Built distribution:

    nuuly_bigquery_mcp_server-1.0.1-py3-none-any.whl (5.1 kB)

    Algorithm    Hash digest
    SHA256       d18708314fd5433a56030d4a83867f525c7cdcb5161ec85c1220e0387fb40d1f
    MD5          059c8d32ba2e2fd7f60ccff8d5eca357
    BLAKE2b-256  4a2405a830144bb56a85b4c5a307a70c8f0c6ab8b0ab0b87a8cf92410abea02b
