Google Search Console integration for Model Context Protocol (MCP)

Project description

Google Search Console MCP server for SEOs

April 2026 (v0.3.0): Coming to the Cursor Marketplace — one-click install with bundled SEO skills. Also: token storage moved to user config dir (survives uvx upgrades), all data tools now return structured JSON, and 39 new unit tests. See the Changelog for details.

A Model Context Protocol (MCP) server that connects Google Search Console (GSC) to AI assistants, allowing you to analyze your SEO data through natural language conversations. Works with Claude, Cursor, Codex, Gemini CLI, Antigravity, and any other MCP-compatible client. This integration gives you access to property information, search analytics, URL inspection, and sitemap management—all through simple chat.


Cursor Marketplace

One-click install available — search for mcp-search-console in the Cursor Marketplace.

After installing, configure your credentials (see Getting Started below) then use the bundled skills directly in Cursor Agent chat:

| Skill | How to invoke | What it does |
|---|---|---|
| seo-weekly-report | "Run the SEO weekly report for example.com" | Full 28-day performance summary with period-over-period comparison and top queries |
| cannibalization-check | "Check for keyword cannibalization on example.com" | Finds queries where multiple pages compete; recommends which to keep |
| indexing-audit | "Audit indexing for my top pages" | Batch-inspects top 20 pages and returns a prioritized fix list |
| content-opportunities | "Find content opportunities for example.com" | Surfaces position-11-20 queries with high impressions and low CTR |
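Conceptually, a cannibalization check groups Search Analytics rows by query and flags queries where several pages earn clicks. The sketch below is illustrative only, not the bundled skill's actual implementation; the row format is an assumption:

```python
from collections import defaultdict

def find_cannibalized_queries(rows, min_pages=2):
    """Group (query, page, clicks) rows and flag queries served by multiple pages.

    rows: iterable of dicts like {"query": ..., "page": ..., "clicks": ...}
    Returns {query: [(page, clicks), ...]} with the strongest page first.
    """
    by_query = defaultdict(dict)
    for row in rows:
        pages = by_query[row["query"]]
        pages[row["page"]] = pages.get(row["page"], 0) + row["clicks"]

    return {
        query: sorted(pages.items(), key=lambda p: p[1], reverse=True)
        for query, pages in by_query.items()
        if len(pages) >= min_pages  # only queries with competing pages
    }

rows = [
    {"query": "blue widgets", "page": "/widgets", "clicks": 120},
    {"query": "blue widgets", "page": "/blog/widgets-guide", "clicks": 45},
    {"query": "red gadgets", "page": "/gadgets", "clicks": 80},
]
result = find_cannibalized_queries(rows)
# "blue widgets" is flagged; the page with more clicks is listed first.
```

The "recommends which to keep" step then typically favors the page with the most clicks or the best average position.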

Required environment variables (set in Cursor MCP settings after install)

| Variable | Required | Description |
|---|---|---|
| GSC_OAUTH_CLIENT_SECRETS_FILE | One of these two | Path to your OAuth client_secrets.json |
| GSC_CREDENTIALS_PATH | One of these two | Path to your service account credentials JSON |
| GSC_DATA_STATE | Optional | all (default, matches GSC dashboard) or final (2–3 day lag) |
| GSC_ALLOW_DESTRUCTIVE | Optional | Set to true to enable add/delete site and delete sitemap tools |

First-time authentication (OAuth users only)

After installing, ask your AI assistant: "Authenticate my Google Search Console" — it will run the reauthenticate tool which opens a browser window once to authorize access. Subsequent uses are token-based and require no interaction.


What Can This Tool Do For SEO Professionals?

  1. Property Management

    • See all your GSC properties in one place
    • Get verification details and basic site information
    • Add new properties to your account
    • Remove properties from your account
  2. Search Analytics & Reporting

    • Discover which search queries bring visitors to your site
    • Track impressions, clicks, and click-through rates
    • Analyze performance trends over time
    • Compare different time periods to spot changes
    • Visualize your data with charts and graphs created by your AI assistant
  3. URL Inspection & Indexing

    • Check if specific pages have indexing problems
    • See when Google last crawled your pages
    • Inspect multiple URLs at once to identify patterns
    • Get actionable insights on how to improve indexing
  4. Sitemap Management

    • View all your sitemaps and their status
    • Submit new sitemaps directly through your AI assistant
    • Check for errors or warnings in your sitemaps
    • Monitor sitemap processing status

Available Tools

Here's what you can ask your AI assistant to do once you've set up this integration:

| Tool | What It Does | What You'll Need to Provide |
|---|---|---|
| list_properties | Shows all your GSC properties | Nothing - just ask! |
| get_site_details | Shows details about a specific site | Your website URL |
| add_site | Adds a new site to your GSC properties | Your website URL |
| delete_site | Removes a site from your GSC properties | Your website URL |
| get_search_analytics | Shows top queries and pages with metrics | Your website URL, time period, and optional row_limit (default 20, max 500) |
| get_performance_overview | Gives a summary of site performance | Your website URL and time period |
| check_indexing_issues | Checks if pages have indexing problems | Your website URL and list of pages to check |
| inspect_url_enhanced | Detailed inspection of a specific URL | Your website URL and the page to inspect |
| get_sitemaps | Lists all sitemaps for your site | Your website URL |
| submit_sitemap | Submits a new sitemap to Google | Your website URL and sitemap URL |

For a complete list of all 20 available tools and their detailed descriptions, ask your AI assistant to "list tools" after setup.


Getting Started (No Coding Experience Required!)

1. Set Up Google Search Console API Access

Before using this tool, you'll need to create API credentials that allow your AI assistant to access your GSC data:

Authentication Options

The tool supports two authentication methods:

1. OAuth Authentication (Recommended)

This method allows you to authenticate with your own Google account, which is often more convenient than using a service account. It will have access to the same resources you normally do.

To skip OAuth and use only service account authentication, set GSC_SKIP_OAUTH to "true", "1", or "yes".

Setup Instructions:
  1. Go to the Google Cloud Console and create a Google Cloud account if you don't have one
  2. Create a new project or select an existing one
  3. Enable the Search Console API for your project
  4. Add scope https://www.googleapis.com/auth/webmasters to your project
  5. Go to the "Credentials" page
  6. Click "Create Credentials" and select "OAuth client ID"
  7. Configure the OAuth consent screen
  8. For application type, select "Desktop app"
  9. Give your OAuth client a name and click "Create"
  10. Download the client secrets JSON file (it will be named something like client_secrets.json)
  11. Either set the GSC_OAUTH_CLIENT_SECRETS_FILE environment variable to the absolute path of this file (required if installing via uvx), or, if you cloned the repo manually, place the file in the same directory as gsc_server.py

When you run the tool for the first time with OAuth authentication, it will open a browser window asking you to sign in to your Google account and authorize the application. After authorization, the tool will save the token for future use.
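The save-and-reuse behavior amounts to a simple check: reuse the cached token while it is still valid, otherwise fall back to the browser flow. A hypothetical sketch (the server's real token handling and cache format may differ):

```python
import json
import time
from pathlib import Path

def needs_browser_flow(token_path: str) -> bool:
    """Return True when no usable cached token exists and the browser
    authorization flow must run again (hypothetical cache format)."""
    path = Path(token_path).expanduser()
    if not path.exists():
        return True  # first run: no saved token yet
    try:
        token = json.loads(path.read_text())
    except json.JSONDecodeError:
        return True  # corrupt cache: re-authorize
    # Assume the cache stores an absolute expiry as a Unix timestamp.
    return token.get("expires_at", 0) <= time.time()
```

In practice a stale token can often be refreshed silently with a refresh token, so the browser is only needed when no valid grant exists at all.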

2. Service Account Authentication

This method uses a service account, which is useful for automated scripts or when you don't want to use your personal Google account. This requires adding the service account as a user in Google Search Console.

Setup Instructions:
  1. Go to the Google Cloud Console and create a Google Cloud account if you don't have one
  2. Create a new project or select an existing one
  3. Enable the Search Console API for your project
  4. Go to the "Credentials" page
  5. Click "Create Credentials" and select "Service Account"
  6. Fill in the service account details and click "Create"
  7. Click on the newly created service account
  8. Go to the "Keys" tab and click "Add Key" > "Create new key"
  9. Select JSON format and click "Create"
  10. Download the key file. Either set the GSC_CREDENTIALS_PATH environment variable to the absolute path of this file (required if installing via uvx), or, if you cloned the repo manually, save it as service_account_credentials.json in the same directory as gsc_server.py
  11. Add your service account email address to appropriate Search Console properties

🎬 A beginner-friendly, step-by-step video tutorial is available on YouTube.

2. Install Required Software

You'll need Python installed on your computer (the optional uv tool, used below, makes setup easier).

Make sure Python is properly installed and available in your system PATH before proceeding.

3. Install the MCP Server

Option A — uvx (simplest, no clone needed)

If you have uv installed, you can skip cloning entirely. Use this config directly in step 5:

{
  "mcpServers": {
    "gscServer": {
      "command": "uvx",
      "args": ["mcp-search-console"],
      "env": {
        "GSC_CREDENTIALS_PATH": "/FULL/PATH/TO/service_account_credentials.json",
        "GSC_SKIP_OAUTH": "true"
      }
    }
  }
}

uvx installs the server in an isolated environment automatically and keeps it up to date. No virtual environment management needed. Skip to Step 5 if using this option.

Important for uvx users: You must set GSC_CREDENTIALS_PATH (or GSC_OAUTH_CLIENT_SECRETS_FILE) to an absolute path. Simply placing credential files in your project folder is not sufficient — uvx runs the code from an internal cache directory you cannot reach, so any "same directory as the script" instructions elsewhere in this README apply only to the clone-install path (Option B).

PyPI package name: The official package is mcp-search-console. A third-party package named mcp-gsc exists on PyPI but is not maintained by this project — make sure you install mcp-search-console.

Option B — Clone manually (more control)

Download this tool to your computer. The easiest way is:

  1. Click the green "Code" button at the top of this page
  2. Select "Download ZIP"
  3. Unzip the downloaded file to a location you can easily find (like your Documents folder)

Alternatively, if you're familiar with Git:

git clone https://github.com/AminForou/mcp-gsc.git

4. Install Required Components (Option B only)

Open your computer's Terminal (Mac) or Command Prompt (Windows):

  1. Navigate to the folder where you unzipped the files:

    # Example (replace with your actual path):
    cd ~/Documents/mcp-gsc-main
    
  2. Create a virtual environment (this keeps the project dependencies isolated):

    # Using uv (recommended):
    uv venv .venv
    
    # If uv is not installed, install it first:
    pip install uv
    # Then create the virtual environment:
    uv venv .venv
    
    # OR using standard Python:
    python -m venv .venv
    

    Note: If you get a "pip not found" error when trying to install uv, see the "If you get 'pip not found' error" section below.

  3. Activate the virtual environment:

    # On Mac/Linux:
    source .venv/bin/activate
    
    # On Windows:
    .venv\Scripts\activate
    
  4. Install the required dependencies:

    # Using uv:
    uv pip install -r requirements.txt
    
    # OR using standard pip:
    pip install -r requirements.txt
    

    If you get "pip not found" error:

    # First ensure pip is installed and updated:
    python3 -m ensurepip --upgrade
    python3 -m pip install --upgrade pip
    
    # Then try installing the requirements again:
    python3 -m pip install -r requirements.txt
    
    # Or to install uv:
    python3 -m pip install uv
    

When you see (.venv) at the beginning of your command prompt, it means the virtual environment is active and the dependencies will be installed there without affecting your system Python installation.

5. Connect Your AI Client to Google Search Console

The configuration below uses Claude Desktop as an example. For other clients (Cursor, Codex, Gemini CLI, Antigravity), the JSON structure is the same — check your client's documentation for where the config file lives.

  1. Download and install Claude Desktop if you haven't already
  2. Make sure you have your Google credentials file saved somewhere on your computer
  3. Open your computer's Terminal (Mac) or Command Prompt (Windows) and type:
   # For Mac users:
   nano ~/Library/Application\ Support/Claude/claude_desktop_config.json
   
   # For Windows users:
   notepad %APPDATA%\Claude\claude_desktop_config.json
  4. Add the following configuration text (this tells your AI client how to connect to GSC):

OAuth authentication (using your own account)

{
  "mcpServers": {
    "gscServer": {
      "command": "/FULL/PATH/TO/mcp-gsc-main/.venv/bin/python",
      "args": ["/FULL/PATH/TO/mcp-gsc-main/gsc_server.py"],
      "env": {
        "GSC_OAUTH_CLIENT_SECRETS_FILE": "/FULL/PATH/TO/client_secrets.json",
        "GSC_DATA_STATE": "all"
      }
    }
  }
}

Service account authentication

{
  "mcpServers": {
    "gscServer": {
      "command": "/FULL/PATH/TO/mcp-gsc-main/.venv/bin/python",
      "args": ["/FULL/PATH/TO/mcp-gsc-main/gsc_server.py"],
      "env": {
        "GSC_CREDENTIALS_PATH": "/FULL/PATH/TO/service_account_credentials.json",
        "GSC_SKIP_OAUTH": "true",
        "GSC_DATA_STATE": "all"
      }
    }
  }
}

Environment Variables Reference

| Variable | Required | Default | Description |
|---|---|---|---|
| GSC_OAUTH_CLIENT_SECRETS_FILE | OAuth only | client_secrets.json next to gsc_server.py (clone install only) | Absolute path to your OAuth client secrets JSON file. Required when installing via uvx. |
| GSC_CREDENTIALS_PATH | Service account only | service_account_credentials.json next to gsc_server.py (clone install only) | Absolute path to your service account JSON key file. Required when installing via uvx. |
| GSC_SKIP_OAUTH | No | false | Set to "true" to force service account auth and skip OAuth |
| GSC_DATA_STATE | No | "all" | "all" returns fresh data matching the GSC dashboard. "final" returns only confirmed data (2–3 day lag). |
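These settings ultimately shape the Search Analytics API request. The sketch below shows how a request body for the public webmasters v3 searchAnalytics.query endpoint might be assembled from them; field names follow the public API, but the server's internal code may differ:

```python
import os

def build_query_body(start_date, end_date, dimensions, row_limit=20):
    """Assemble a Search Analytics request body, honoring GSC_DATA_STATE."""
    data_state = os.environ.get("GSC_DATA_STATE", "all")
    if data_state not in ("all", "final"):
        data_state = "all"  # fall back to the documented default
    return {
        "startDate": start_date,          # inclusive, YYYY-MM-DD
        "endDate": end_date,              # inclusive, YYYY-MM-DD
        "dimensions": dimensions,         # e.g. ["query"] or ["page", "device"]
        "rowLimit": min(row_limit, 500),  # the server caps row_limit at 500
        "dataState": data_state,
    }

body = build_query_body("2026-03-01", "2026-03-28", ["query"], row_limit=1000)
# rowLimit is clamped to 500; dataState defaults to "all".
```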

Important: Replace all paths with the actual locations on your computer:

  • The first path should point to the Python executable inside your virtual environment
  • The second path should point to the gsc_server.py file inside the folder you unzipped
  • The third path should point to your credentials JSON file (client_secrets.json for OAuth, or your service account key)

Examples:

  • Mac:
    • Python path: /Users/yourname/Documents/mcp-gsc/.venv/bin/python
    • Script path: /Users/yourname/Documents/mcp-gsc/gsc_server.py
  • Windows:
    • Python path: C:\\Users\\yourname\\Documents\\mcp-gsc\\.venv\\Scripts\\python.exe
    • Script path: C:\\Users\\yourname\\Documents\\mcp-gsc\\gsc_server.py
  5. Save the file:

    • Mac: Press Ctrl+O, then Enter, then Ctrl+X to exit
    • Windows: Click File > Save, then close Notepad
  6. Restart your AI client

  7. When it opens, you should now see GSC tools available in the tools section

6. Start Analyzing Your SEO Data!

Now you can ask your AI assistant questions about your GSC data! It can not only retrieve the data but also analyze it, explain trends, and create visualizations to help you understand your SEO performance better.

Here are some powerful prompts you can use with each tool:

| Tool Name | Sample Prompt |
|---|---|
| list_properties | "List all my GSC properties and tell me which ones have the most pages indexed." |
| get_site_details | "Analyze the verification status of mywebsite.com and explain what the ownership details mean." |
| add_site | "Add my new website https://mywebsite.com to Search Console and verify its status." |
| delete_site | "Remove the old test site https://test.mywebsite.com from Search Console." |
| get_search_analytics | "Show me the top 20 search queries for mywebsite.com in the last 30 days, highlight any with CTR below 2%, and suggest title improvements." |
| get_performance_overview | "Create a visual performance overview of mywebsite.com for the last 28 days, identify any unusual drops or spikes, and explain possible causes." |
| check_indexing_issues | "Check these important pages for indexing issues and prioritize which ones need immediate attention: mywebsite.com/product, mywebsite.com/services, mywebsite.com/about" |
| inspect_url_enhanced | "Do a comprehensive inspection of mywebsite.com/landing-page and give me actionable recommendations to improve its indexing status." |
| batch_url_inspection | "Inspect my top 5 product pages, identify common crawling or indexing patterns, and suggest technical SEO improvements." |
| get_sitemaps | "List all sitemaps for mywebsite.com, identify any with errors, and recommend next steps." |
| list_sitemaps_enhanced | "Analyze all my sitemaps for mywebsite.com, focusing on error patterns, and create a prioritized action plan." |
| submit_sitemap | "Submit my new product sitemap at https://mywebsite.com/product-sitemap.xml and explain how long it typically takes for Google to process it." |
| get_sitemap_details | "Check the status of my main sitemap at mywebsite.com/sitemap.xml and explain what the warnings mean for my SEO." |
| get_search_by_page_query | "What search terms are driving traffic to my blog post at mywebsite.com/blog/post-title? Identify opportunities to optimize for related keywords." |
| compare_search_periods | "Compare my site's performance between January and February. What queries improved the most, which declined, and what might explain these changes?" |
| get_advanced_search_analytics | "Analyze queries with high impressions but positions below 10, filtered to mobile traffic in the US only. Use filters with country=usa and device=MOBILE." |
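For prompts like the compare_search_periods example, the two windows are simply adjacent date ranges. A small illustrative helper (assumed behavior, not the tool's actual code):

```python
from datetime import date, timedelta

def adjacent_periods(end: date, days: int):
    """Return ((prev_start, prev_end), (curr_start, curr_end)) for two
    back-to-back windows of `days` days ending on `end` (all inclusive)."""
    curr_start = end - timedelta(days=days - 1)
    prev_end = curr_start - timedelta(days=1)
    prev_start = prev_end - timedelta(days=days - 1)
    return (prev_start, prev_end), (curr_start, end)

prev, curr = adjacent_periods(date(2026, 2, 28), 28)
# curr covers 2026-02-01..2026-02-28; prev covers 2026-01-04..2026-01-31
```

Using equal-length, back-to-back windows keeps the comparison fair: each period contains the same number of days.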

You can also ask your AI assistant to combine multiple tools and analyze the results. For example:

  • "Find my top 20 landing pages by traffic, check their indexing status, and create a report highlighting any pages with both high traffic and indexing issues."

  • "Analyze my site's performance trend over the last 90 days, identify my fastest-growing queries, and check if the corresponding landing pages have any technical issues."

  • "Compare my desktop vs. mobile search performance, visualize the differences with charts, and recommend specific pages that need mobile optimization based on performance gaps."

  • "Identify queries where I'm ranking on page 2 (positions 11-20) that have high impressions but low CTR, then inspect the corresponding URLs and suggest title and meta description improvements."

Your AI assistant will use the GSC tools to fetch the data, present it in an easy-to-understand format, create visualizations when helpful, and provide actionable insights based on the results.


Data Visualization Capabilities

Your AI assistant can help you visualize your GSC data in various ways:

  • Trend Charts: See how metrics change over time
  • Comparison Graphs: Compare different time periods or dimensions
  • Performance Distributions: Understand how your content performs across positions
  • Correlation Analysis: Identify relationships between different metrics
  • Heatmaps: Visualize complex datasets with color-coded representations

Simply ask your AI assistant to "visualize" or "create a chart" when analyzing your data, and it will generate appropriate visualizations to help you understand the information better.


Troubleshooting

Python Command Not Found

On macOS, the default Python command is often python3 rather than python, which can cause issues with some applications including Node.js integrations.

If you encounter errors related to Python not being found, you can create an alias:

  1. Create a Python alias (one-time setup):

    # For macOS users:
    sudo ln -s $(which python3) /usr/local/bin/python
    
    # If that doesn't work, try finding your Python installation:
    sudo ln -s /Library/Frameworks/Python.framework/Versions/3.11/bin/python3 /usr/local/bin/python
    
  2. Verify the alias works:

    python --version
    

This creates a symbolic link so that when applications call python, they'll actually use your python3 installation.

"Service account credentials file not found" or "Authentication failed" (especially with uvx)

If you installed via uvx and see an authentication error even though you placed your credentials JSON file in your cloned repo folder, this is the expected cause: uvx runs the code from an internal cache directory (~/.cache/uv/archive-v0/...), not from your project folder. The server can't find credential files by filename in the "script directory" because that directory is the uv cache.

Fix: Set the credential path explicitly as an absolute path in your MCP client config:

  • For service accounts: "GSC_CREDENTIALS_PATH": "/Users/you/creds/service_account_credentials.json" (use C:\\Users\\you\\... on Windows)
  • For OAuth: "GSC_OAUTH_CLIENT_SECRETS_FILE": "/Users/you/creds/client_secrets.json"

If you set one of these env vars but still get an error, the server will now tell you the exact resolved path it tried — double-check for typos and confirm the file exists at that path. ~/ and $HOME are expanded, so ~/creds.json also works.
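The resolution order described above can be sketched as follows. This is a hypothetical helper mirroring the documented behavior (env var first, with ~ and $HOME expansion, then the script directory for clone installs), not the server's actual internals:

```python
import os
from pathlib import Path

def resolve_credentials(env_var: str, fallback_name: str, script_dir: Path):
    """Return (path_tried, exists) for the credential file lookup."""
    configured = os.environ.get(env_var)
    if configured:
        # ~ and $HOME are expanded, so "~/creds.json" works too.
        path = Path(os.path.expandvars(configured)).expanduser()
        return (path, path.exists())
    # Clone installs only: look next to gsc_server.py. Under uvx this
    # directory is the uv cache, so the file will not be found there.
    path = script_dir / fallback_name
    return (path, path.exists())
```

Returning the resolved path even on failure is what lets an error message show "the exact resolved path it tried".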

AI Client Configuration Issues

If you're having trouble connecting:

  1. Make sure all file paths in your configuration are correct and use the full path
  2. Check that your service account has access to your GSC properties
  3. Restart your AI client after making any changes
  4. Look for error messages in the response when you try to use a tool
  5. Ensure your virtual environment is activated when running the server manually

Other Unexpected Issues

If you encounter any other unexpected issues during installation or usage:

  1. Copy the exact error message you're receiving
  2. Use ChatGPT or Claude and explain your problem in detail, including:
    • What you were trying to do
    • The exact error message
    • Your operating system
    • Any steps you've already tried
  3. AI assistants can often help diagnose and resolve technical issues by suggesting specific solutions for your situation

Remember that most issues have been encountered by others before, and there's usually a straightforward solution available.


Safety: Destructive Operations

By default, the tools that can permanently modify your GSC account (add_site, delete_site, delete_sitemap) are disabled. If you ask the AI to "clean things up" or "remove old properties", it will explain the safety restriction instead of deleting data.

To enable these tools, set the GSC_ALLOW_DESTRUCTIVE environment variable:

# In your MCP client config (Claude Desktop, Cursor, etc.)
GSC_ALLOW_DESTRUCTIVE=true

If you never use add/delete operations, you don't need to do anything — your existing setup works exactly as before.
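A guard like this can be sketched as a plain environment check in front of any destructive call (illustrative only; the server's actual guard may differ, and the refusal message here is invented):

```python
import os

def destructive_allowed() -> bool:
    """True only when GSC_ALLOW_DESTRUCTIVE is explicitly set to true."""
    return os.environ.get("GSC_ALLOW_DESTRUCTIVE", "").strip().lower() == "true"

def delete_site(site_url: str) -> str:
    """Hypothetical destructive tool wrapper: refuse unless opted in."""
    if not destructive_allowed():
        return ("Destructive operations are disabled. "
                "Set GSC_ALLOW_DESTRUCTIVE=true to enable delete_site.")
    # ... the actual deletion API call would happen here ...
    return f"Deleted {site_url}"
```

Returning an explanatory string, rather than raising, is what lets the AI relay the safety restriction to you in chat.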


Remote Deployment & Docker (Advanced)

The standard setup above runs the server locally on your machine. This section is only for users who want to run it on a remote server, in a container, or share it with a team — existing local users don't need any of this.

HTTP Transport

By default the server communicates over stdio (standard input/output), which only works locally. To run it as a network server, set the MCP_TRANSPORT environment variable:

MCP_TRANSPORT=sse MCP_HOST=0.0.0.0 MCP_PORT=3001 python gsc_server.py

Your MCP client then connects to http://your-server:3001/sse instead of launching the process locally.

| Variable | Default | Description |
|---|---|---|
| MCP_TRANSPORT | stdio | Set to sse for network/remote use |
| MCP_HOST | 127.0.0.1 | Host to bind (use 0.0.0.0 for all interfaces) |
| MCP_PORT | 3001 | Port to bind |
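At startup the three variables can be resolved with the defaults from the table above; the variable names are from this README, but the parsing logic shown is an assumption:

```python
import os

def transport_config():
    """Resolve transport settings with the documented defaults."""
    transport = os.environ.get("MCP_TRANSPORT", "stdio")
    host = os.environ.get("MCP_HOST", "127.0.0.1")
    port = int(os.environ.get("MCP_PORT", "3001"))
    return transport, host, port

# With no env vars set this yields ("stdio", "127.0.0.1", 3001);
# setting MCP_TRANSPORT=sse switches to the network mode described above.
```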

Docker

A Dockerfile is included in the repo. Build and run:

# Build the image
docker build -t mcp-gsc .

# Run locally (stdio mode — for testing)
docker run -v /path/to/client_secrets.json:/app/client_secrets.json mcp-gsc

# Run as a network server (SSE mode — for remote use)
docker run \
  -e MCP_TRANSPORT=sse \
  -e MCP_HOST=0.0.0.0 \
  -e MCP_PORT=3001 \
  -e GSC_CREDENTIALS_PATH=/app/credentials.json \
  -v /path/to/credentials.json:/app/credentials.json \
  -p 3001:3001 \
  mcp-gsc

Cloud Platforms

The Docker image works on any container platform. Set MCP_TRANSPORT=sse, MCP_HOST=0.0.0.0, and inject credentials via environment variables or mounted secrets:

  • Railway — connect your repo, set env vars in the dashboard
  • Render — deploy as a Web Service, set env vars under Environment
  • Fly.io — fly deploy, set secrets with fly secrets set

Related Tools

If you work with Google Search Console regularly, you may also find these tools useful:

Advanced GSC Visualizer — A Chrome extension (14,000+ users) that brings powerful charts, annotations, and one-click API access directly inside Google Search Console. Features include:

  • Interactive charts with trendlines, moving averages, and Google algorithm update overlays
  • One-click export of up to 25,000 rows from the GSC API — no coding required
  • Keyword cannibalization detection
  • Crawl stats visualizations
  • AI assistant for querying your GSC data directly in the browser

Built by the same author. Install from the Chrome Web Store →


Contributing

Found a bug or have an idea for improvement? We welcome your input! Open an issue or submit a pull request on GitHub.


License

This project is licensed under the MIT License. See the LICENSE file for details.


Changelog

[0.3.0] — April 2026

  • Cursor Marketplace plugin — added .cursor-plugin/plugin.json, mcp.json, and 4 bundled SEO skills (seo-weekly-report, cannibalization-check, indexing-audit, content-opportunities)
  • Stable token storage — OAuth token now stored in the platform user config dir (~/Library/Application Support/mcp-gsc/ on macOS, ~/.config/mcp-gsc/ on Linux) instead of the package directory; survives uvx upgrades. Existing tokens are silently migrated on first run.
  • Structured JSON output — all 13 data-returning tools now return structured JSON (json.dumps) instead of pipe-separated text, improving AI reasoning accuracy
  • Dependency fix — added platformdirs>=4.0.0; removed deprecated oauth2client from requirements.txt
  • MCP safety — fixed stdout pollution (print() → logging.warning()) that could corrupt the stdio MCP protocol; replaced a silent browser-flow hang in MCP context with a clear RuntimeError directing users to reauthenticate
  • Test suite — 39 unit tests covering auth, all 13 data tools, safety guards, stdout cleanliness, and token migration (zero real credentials required)

[0.2.2] — April 2026

Added

  • Safety mode for destructive tools: add_site, delete_site, and delete_sitemap are now disabled by default. Set GSC_ALLOW_DESTRUCTIVE=true to enable them. This prevents accidental deletion of GSC properties through vague AI instructions.
  • HTTP/SSE transport: Set MCP_TRANSPORT=sse (plus MCP_HOST and MCP_PORT) to run the server as a network service instead of a local process. Enables Docker, cloud, and team deployments.
  • Dockerfile: Official container image using the uv base image. Includes .dockerignore to prevent credential files from being baked into images.
  • CLAUDE.md: Project context file for AI coding assistants — covers auth, env vars, and how to add new tools.

Fixed

  • Sitemap warning status: get_sitemaps now correctly shows "Has warnings" when a sitemap has warnings but no errors. Previously, warnings were silently ignored in the status field. (Thanks @nloadholtes!)

Improved

  • PyPI package: pyproject.toml now correctly declares gsc_server.py as the installable module. pip install mcp-gsc and uvx mcp-gsc now produce a working installation. (Thanks @jjeejj!)

[0.2.1] — March 2026

Added

  • Reauthenticate tool: New reauthenticate tool lets you switch Google accounts by deleting the saved OAuth token and triggering a fresh browser login. Ask your AI assistant: "switch to a different Google account". (Thanks @fterenzani!)

Fixed

  • Sitemap TypeError crash: get_sitemaps and list_sitemaps_enhanced crashed with TypeError when a sitemap had errors or warnings, because the GSC API returns those counts as strings. Added int() casts before comparison. (Thanks @mcprobert!)
  • File cache warning: Suppressed the file_cache is only supported with oauth2client<4.0.0 warning that caused crashes on MCP hosts that treat any stderr output as fatal (e.g. GitHub Copilot CLI).
  • Domain property 404 errors: All tools now return a clear, actionable message when a 404 occurs, explaining the exact format required and service account permission requirements for sc-domain: properties.

Improved

  • Multi-client support: README now explicitly lists Claude, Cursor, Codex, Gemini CLI, and Antigravity as supported clients with setup guidance for each.
  • site_url guidance: All 15 tool docstrings now explain how to get the exact property URL from list_properties and how domain properties relate to subdomain filtering.

[0.2.0] — March 2026

Added

  • Data freshness: All search analytics queries now use dataState: "all" by default, returning data that matches the GSC dashboard instead of finalized-only data (which lags 2–3 days). Configurable via the GSC_DATA_STATE environment variable ("all" or "final").
  • Flexible row limits: get_search_analytics and get_search_by_page_query now accept an optional row_limit parameter (default 20, max 500). Claude will automatically choose an appropriate value based on your request — use higher values for comprehensive analysis, lower values for quick overviews.
  • Multi-dimension filtering: get_advanced_search_analytics now accepts a filters parameter — a JSON array of filter objects for AND logic across multiple dimensions simultaneously (e.g., country = USA and device = mobile). The existing single-filter parameters (filter_dimension, filter_operator, filter_expression) remain fully supported.

[0.1.0] — Initial release

  • 19 tools covering property management, search analytics, URL inspection, and sitemap management
  • OAuth and service account authentication
  • Batch URL inspection (up to 10 URLs)
  • Period comparison tool

Project details


Download files

Download the file for your platform.

Source Distribution

mcp_search_console-0.3.2.tar.gz (50.8 kB)

Built Distribution


mcp_search_console-0.3.2-py3-none-any.whl (30.0 kB)

File details

Details for the file mcp_search_console-0.3.2.tar.gz.

File metadata

  • Download URL: mcp_search_console-0.3.2.tar.gz
  • Upload date:
  • Size: 50.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.14.4

File hashes

Hashes for mcp_search_console-0.3.2.tar.gz

| Algorithm | Hash digest |
|---|---|
| SHA256 | 54437d6258b3775f6006a004fe3911f01491d4e058a7dd4ca7abc2d5b7d8d26b |
| MD5 | ca684c75f1c35993ed070987b73d533d |
| BLAKE2b-256 | e94d5f24a587c642bfb80cf0e6b25313f9f587670a03213a173466c5f7e87285 |


File details

Details for the file mcp_search_console-0.3.2-py3-none-any.whl.

File hashes

Hashes for mcp_search_console-0.3.2-py3-none-any.whl

| Algorithm | Hash digest |
|---|---|
| SHA256 | aa5749f4b90439a38f1c0c2a2e80511c141282112b6f868f08c3295c57181aed |
| MD5 | 7d44307a66c1086e95df7160643fbb29 |
| BLAKE2b-256 | f1588fd6b7565ab049fb67702be4c1cd6da1754ea3e85653941e5de9722ab729 |

