
Nuuly BigQuery MCP Server

This MCP server provides AI code editors with access to Google BigQuery data through the Model Context Protocol (MCP). It allows AI assistants to understand BigQuery datasets and tables, and to run queries against them. This implementation acts as a client that forwards requests to a remote BigQuery MCP server running on Google Cloud Run.
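The forwarding step described above can be sketched as a plain HTTP POST to the Cloud Run endpoint. This is a hypothetical illustration only: the request body shape ({"tool": ..., "arguments": ...}) and the Bearer Authorization header are assumptions, not taken from the package source.

```python
import json
import urllib.request

# Remote endpoint from this README.
DEFAULT_URL = "https://bigquery-mcp-toolbox-oe7jbzhmjq-uk.a.run.app/mcp/invoke"

def build_forward_request(tool: str, arguments: dict, api_key: str,
                          url: str = DEFAULT_URL) -> urllib.request.Request:
    """Wrap an MCP tool call as an HTTP POST to the remote service.

    The payload shape and the Authorization scheme are assumptions for
    illustration; the real server's wire format may differ.
    """
    body = json.dumps({"tool": tool, "arguments": arguments}).encode()
    return urllib.request.Request(
        url,
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
        method="POST",
    )

# Build (but don't send) a request for the list_databases tool.
req = build_forward_request("list_databases", {}, api_key="your-api-key-here")
```

Sending the request (for example with urllib.request.urlopen) is omitted here; the point is that the local process holds no BigQuery credentials itself and simply relays authenticated calls to the remote service.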

Features

  • List available BigQuery datasets
  • Get detailed schema information for tables in a dataset
  • Run SQL queries against BigQuery datasets
  • Remote execution via Cloud Run service

Prerequisites

  • Python 3.8+

Installation

From PyPI (Recommended)

Install the package directly from PyPI:

pip install nuuly-bigquery-mcp-server

From Source (Not Recommended)

  1. Clone the repository:

    git clone https://github.com/urbn/r15-mcp.git
    cd r15-mcp/mcp_servers/bigquery-mcp
    
  2. Install in development mode:

    pip install -e .
    

Environment Variables

The server requires the following environment variables:

  • BQ_API_KEY (required): API key used to authenticate with the remote BigQuery MCP server (see Confluence)
  • BIGQUERY_MCP_SERVER_URL (optional): URL of the remote BigQuery MCP server; if you set it, use https://bigquery-mcp-toolbox-oe7jbzhmjq-uk.a.run.app/mcp/invoke
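For local testing you can set both variables in the shell before launching your MCP client. A minimal example (the API key value is a placeholder):

```shell
# Set the API key (required) and the remote server URL (optional).
export BQ_API_KEY="your-api-key-here"
export BIGQUERY_MCP_SERVER_URL="https://bigquery-mcp-toolbox-oe7jbzhmjq-uk.a.run.app/mcp/invoke"

# Confirm both variables are visible to child processes.
env | grep -E '^(BQ_API_KEY|BIGQUERY_MCP_SERVER_URL)='
```

In normal use you would instead put these values in the env block of your MCP configuration, as shown in the next section of this README.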

Configure your MCP Server

You must add this MCP server to the MCP configuration file for your LLM client. Once the package is installed, the nuuly-bigquery-mcp command is on your PATH, so the entry only needs the command name and the environment variables:

Example MCP Configuration for Claude Desktop:

{
  "mcpServers": {
    "nuuly-bigquery-mcp": {
      "command": "nuuly-bigquery-mcp",
      "env": {
        "BQ_API_KEY": "your-api-key-here",
        "BIGQUERY_MCP_SERVER_URL": "https://bigquery-mcp-toolbox-oe7jbzhmjq-uk.a.run.app/mcp/invoke"
      }
    }
  }
}

Note: Replace your-api-key-here with your actual BigQuery MCP API key.

Available Tools

The server provides the following tools, which are forwarded to the remote BigQuery MCP server:

1. list_databases

Lists all available BigQuery datasets.

list_databases()

2. get_schema

Gets the schema of all tables in a specified dataset.

get_schema(database="your_dataset_name")

3. run_query

Runs a SQL query against a BigQuery dataset.

run_query(database="your_dataset_name", sql="SELECT * FROM your_table LIMIT 10")
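As a practical precaution when composing SQL for run_query, you may want to cap exploratory queries with a LIMIT clause before sending them, so an ad-hoc call doesn't return an unbounded result set. The helper below is hypothetical (not part of this package) and shows one way to do that:

```python
def limited(sql: str, limit: int = 10) -> str:
    """Append a LIMIT clause to a query that doesn't already end with one.

    Hypothetical convenience helper for exploratory use; not part of
    the nuuly-bigquery-mcp-server package.
    """
    stripped = sql.rstrip().rstrip(";")
    words = stripped.split()
    # Leave the query alone if it already ends in "LIMIT <n>".
    if len(words) >= 2 and words[-2].upper() == "LIMIT":
        return stripped
    return f"{stripped} LIMIT {limit}"

print(limited("SELECT * FROM sales_data"))
# → SELECT * FROM sales_data LIMIT 10
```

The capped string can then be passed straight to run_query as the sql argument.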

Example Usage

Here's an example of how to use the BigQuery MCP server with Claude:

> I need to analyze data in BigQuery. Can you help me understand what datasets are available?

Claude will use the list_databases tool to show available datasets.

> Now I want to see the schema of the 'analytics' dataset.

Claude will use the get_schema tool to show the tables and their schemas in the 'analytics' dataset.

> Run a query to get the top 5 products by sales from the sales_data table.

Claude will use the run_query tool to execute the SQL query and display the results.

License

Copyright © 2025 URBN Inc.

Download files

Source Distribution

nuuly_bigquery_mcp_server-1.0.2.tar.gz (4.4 kB)

  SHA256: bb04c86ef6273d610f4089f7ed233ce7cbef7368552016e7aba61e8f69723de8
  MD5: 2b4c9cf675ede49d586eb10080381e0c
  BLAKE2b-256: 4ac5241b8c60223c39112e7a790a4ee982aa676a363a99cff0f867c0f902786d

Built Distribution

nuuly_bigquery_mcp_server-1.0.2-py3-none-any.whl (5.1 kB)

  SHA256: e9b7600d1ae3ff8b0f713e49ef0467c9c1792eb153a168d8de296d29c29666ce
  MD5: db77ad37f9b132cbc2eba463abe10441
  BLAKE2b-256: 2e6268e084ef14c6f3f69ac526f7847c061a773fba84301451597f66514fa6be
