
UK broadband data analysis MCP server with Snowflake integration


Point Topic MCP Server

UK broadband data analysis server exposed via the Model Context Protocol (MCP). A simple stdio-based server for local development and Claude Desktop integration.

✅ what's implemented

core tools:

  • assemble_dataset_context() - returns database schemas and usage examples for the available datasets (upc, upc_take_up, upc_forecast)
  • execute_query() - runs safe read-only SQL queries against Snowflake

authentication: environment variables for Snowflake credentials
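The "safe read-only" check behind execute_query() could be sketched as follows. This is illustrative only (the package's actual validation logic may differ): allow statements that start with SELECT or WITH, and reject anything containing a state-mutating keyword.

```python
import re

# Allow plain SELECT statements and WITH ... SELECT (CTE) queries.
_READ_ONLY = re.compile(r"^\s*(select|with)\b", re.IGNORECASE)
# Reject anything containing a state-mutating keyword.
_FORBIDDEN = re.compile(
    r"\b(insert|update|delete|merge|drop|alter|create|truncate|grant|call)\b",
    re.IGNORECASE,
)

def is_read_only(sql: str) -> bool:
    """Return True if the query looks like a safe read-only statement."""
    return bool(_READ_ONLY.match(sql)) and not _FORBIDDEN.search(sql)
```

For example, is_read_only("SELECT * FROM upc") passes, while a query that smuggles in a DELETE after a semicolon is rejected.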

installation (for end users)

simple pip install:

pip install point-topic-mcp

add to your MCP client (Claude Desktop, Cursor, etc.):

{
  "mcpServers": {
    "point-topic": {
      "command": "point-topic-mcp",
      "env": {
        "SNOWFLAKE_ACCOUNT": "your_account",
        "SNOWFLAKE_USER": "your_user", 
        "SNOWFLAKE_PASSWORD": "your_password",
        "SNOWFLAKE_WAREHOUSE": "your_warehouse",
        "SNOWFLAKE_DATABASE": "your_database",
        "SNOWFLAKE_SCHEMA": "your_schema"
      }
    }
  }
}

Claude Desktop config locations:

  • Mac: ~/Library/Application Support/Claude/claude_desktop_config.json
  • Windows: %APPDATA%\Claude\claude_desktop_config.json

development setup

setup: uv sync

for local development with Claude Desktop (this will add the server to your Claude Desktop config):

uv run mcp install server_local.py --with "snowflake-connector-python[pandas]" -f .env

for mcp inspector:

uv run mcp dev src/point_topic_mcp/server_local.py

environment configuration:

create a .env file with your Snowflake credentials:

# Your Snowflake credentials
SNOWFLAKE_ACCOUNT=your_account
SNOWFLAKE_USER=your_user
SNOWFLAKE_PASSWORD=your_password
SNOWFLAKE_WAREHOUSE=your_warehouse
SNOWFLAKE_DATABASE=your_database
SNOWFLAKE_SCHEMA=your_schema
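Since the server reads all six variables at startup, it is worth failing fast when any are missing. A minimal sketch (missing_credentials is illustrative, not part of the package):

```python
import os

# The six variables the .env template above lists.
REQUIRED_VARS = [
    "SNOWFLAKE_ACCOUNT",
    "SNOWFLAKE_USER",
    "SNOWFLAKE_PASSWORD",
    "SNOWFLAKE_WAREHOUSE",
    "SNOWFLAKE_DATABASE",
    "SNOWFLAKE_SCHEMA",
]

def missing_credentials(env=os.environ) -> list:
    """Return the names of required Snowflake variables that are unset or empty."""
    return [name for name in REQUIRED_VARS if not env.get(name)]
```

Calling it with an empty mapping returns all six names; with a fully populated environment it returns an empty list, and the server can proceed to connect.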

architecture

stdio transport: communicates with MCP clients (Claude Desktop, Cursor) via standard input/output for local integration
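The stdio transport exchanges JSON-RPC 2.0 messages, one JSON object per line, over the process's standard input and output. A minimal sketch of the framing only (not a full MCP implementation; send/recv are illustrative names):

```python
import io
import json
import sys

def send(message: dict, out=sys.stdout) -> None:
    """Serialize one JSON-RPC 2.0 message as a single line of JSON."""
    out.write(json.dumps(message) + "\n")
    out.flush()

def recv(inp=sys.stdin) -> dict:
    """Read and parse one newline-delimited JSON-RPC message."""
    return json.loads(inp.readline())
```

A client like Claude Desktop spawns the server process and speaks this framing over the pipe, e.g. sending {"jsonrpc": "2.0", "id": 1, "method": "tools/list"} to discover the two tools below.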

tools: two main tools for Snowflake data analysis

  • assemble_dataset_context() - provides schemas and context for available datasets
  • execute_query() - executes safe read-only SQL queries

authentication: uses environment variables for Snowflake database credentials
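Conceptually, assemble_dataset_context() is a lookup from dataset name to schema context. The sketch below is hypothetical: the dataset names come from this README, but the descriptions and column names are invented placeholders, not the package's real schemas.

```python
# Illustrative only: dataset names are from the README; the schema
# descriptions and column names below are invented placeholders.
DATASET_CONTEXT = {
    "upc": "UK premises coverage -- e.g. columns: postcode, operator, technology",
    "upc_take_up": "Broadband take-up estimates -- e.g. columns: postcode, take_up_pct",
    "upc_forecast": "Forecast series -- e.g. columns: postcode, year, forecast_value",
}

def assemble_dataset_context(dataset: str) -> str:
    """Return schema context for a known dataset, or raise ValueError."""
    if dataset not in DATASET_CONTEXT:
        raise ValueError(f"unknown dataset: {dataset!r}")
    return DATASET_CONTEXT[dataset]
```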

publishing to PyPI (for maintainers)

build and test locally:

# Build the package with UV (super fast!)
uv build

# Test installation locally
pip install dist/point_topic_mcp-*.whl

# Test the command works
point-topic-mcp

publish to PyPI:

# Set up PyPI credentials in ~/.pypirc first (one time setup)
# [pypi]
#   username = __token__
#   password = pypi-xxxxx...

# Upload to PyPI with UV
uv publish

# Or use the publish script
./publish.sh

test installation from PyPI:

pip install point-topic-mcp
point-topic-mcp
