
Point Topic MCP Server

UK broadband data analysis server via Model Context Protocol. Simple stdio-based server for local development and Claude Desktop integration.

✅ what's implemented

core tools:

  • assemble_dataset_context() - gets database schemas and examples for datasets (upc, upc_take_up, upc_forecast, ontology)
  • execute_query() - runs safe read-only SQL queries against Snowflake

authentication: environment variables for Snowflake credentials
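
A minimal sketch of how the server might read those credentials at startup (the helper name `load_snowflake_credentials` is illustrative, not the actual implementation):

```python
import os

REQUIRED_VARS = ("SNOWFLAKE_USER", "SNOWFLAKE_PASSWORD")

def load_snowflake_credentials():
    """Read Snowflake credentials from the environment, failing fast if any are missing."""
    missing = [name for name in REQUIRED_VARS if not os.environ.get(name)]
    if missing:
        raise RuntimeError(f"missing Snowflake credentials: {', '.join(missing)}")
    return {name.lower(): os.environ[name] for name in REQUIRED_VARS}
```

Failing fast at startup gives a clearer error than a connection failure deep inside a tool call.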

installation (for end users)

simple pip install:

pip install point-topic-mcp

add to your MCP client (Claude Desktop, Cursor, etc.):

{
  "mcpServers": {
    "point-topic": {
      "command": "point-topic-mcp",
      "env": {
        "SNOWFLAKE_USER": "your_user", 
        "SNOWFLAKE_PASSWORD": "your_password"
      }
    }
  }
}

Claude Desktop config location:

  • Mac: ~/Library/Application Support/Claude/claude_desktop_config.json
  • Windows: %APPDATA%\Claude\claude_desktop_config.json

development setup

setup: uv sync

for local development with claude desktop (this adds the server to your claude desktop config):

uv run mcp install src/point_topic_mcp/server_local.py --with "snowflake-connector-python[pandas]" -f .env

for mcp inspector:

uv run mcp dev src/point_topic_mcp/server_local.py

environment configuration:

create .env file with your snowflake credentials:

# Your Snowflake credentials
SNOWFLAKE_USER=your_user
SNOWFLAKE_PASSWORD=your_password

architecture

stdio transport: communicates with MCP clients (Claude Desktop, Cursor) via standard input/output for local integration

tools: two main tools for Snowflake data analysis

  • assemble_dataset_context() - provides schemas and context for available datasets
  • execute_query() - executes safe read-only SQL queries

authentication: uses environment variables for Snowflake database credentials
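
The read-only guarantee can be enforced with a simple statement check before any SQL reaches Snowflake. A minimal sketch (the `is_read_only` helper is hypothetical, not the server's actual guard, which may be stricter):

```python
import re

# write/DDL keywords that should never appear in a read-only query
FORBIDDEN = re.compile(
    r"\b(insert|update|delete|drop|alter|create|truncate|merge|grant)\b",
    re.IGNORECASE,
)

def is_read_only(sql: str) -> bool:
    """Heuristic guard: allow a single SELECT/WITH statement with no write keywords."""
    stripped = sql.strip().rstrip(";")
    if ";" in stripped:  # reject multi-statement input
        return False
    if not re.match(r"(select|with)\b", stripped, re.IGNORECASE):
        return False
    return not FORBIDDEN.search(stripped)
```

A keyword blocklist like this is a useful first line of defence, but pairing it with a Snowflake role that only has SELECT privileges is what actually makes the guarantee hold.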

adding new datasets

this project uses a modular dataset system that allows easy addition of new data sources. each dataset is self-contained and automatically discovered by the MCP server.

dataset structure

each dataset is a python module in src/point_topic_mcp/context/datasets/ with two required functions:

def get_dataset_summary():
    """Brief description visible to agents at all times.
    Keep concise - this goes in every agent prompt."""
    return "short description of what data is available"

# DB_INFO, DB_SCHEMA and SQL_EXAMPLES are string constants defined in the same module
def get_db_info():
    """Complete context: schema, instructions, examples.
    Only loaded when agent requests this dataset."""
    return f"""
    {DB_INFO}
    
    {DB_SCHEMA}
    
    {SQL_EXAMPLES}
    """

key principles

  1. context window efficiency: keep get_dataset_summary() extremely concise - it's always visible to agents
  2. lazy loading: full context via get_db_info() only loads when needed
  3. self-contained: each dataset module includes all its own schema, examples, and usage notes
  4. auto-discovery: new .py files in the datasets directory are automatically available
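
The auto-discovery step can be as simple as iterating over the package's modules; a sketch of one way to do it (the `discover_datasets` helper is illustrative, the server's own discovery code may differ):

```python
import importlib
import pkgutil

def discover_datasets(package_name="point_topic_mcp.context.datasets"):
    """Import every module in the datasets package and map module name -> summary."""
    package = importlib.import_module(package_name)
    summaries = {}
    for info in pkgutil.iter_modules(package.__path__):
        module = importlib.import_module(f"{package_name}.{info.name}")
        summaries[info.name] = module.get_dataset_summary()
    return summaries
```

Because discovery walks the package at runtime, dropping a new `.py` file into the directory is all it takes; no registry to update.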

adding a new dataset

  1. create the module: src/point_topic_mcp/context/datasets/your_dataset.py
  2. implement required functions: get_dataset_summary() and get_db_info()
  3. test locally: uv run mcp dev src/point_topic_mcp/server_local.py
  4. verify discovery: agent should see your dataset in assemble_dataset_context() tool description

see existing modules (upc.py, upc_take_up.py, upc_forecast.py) for structure examples.
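
Putting the steps together, a complete (hypothetical) dataset module might look like this; the `fibre_rollout` dataset, its schema, and its columns are invented for illustration:

```python
# src/point_topic_mcp/context/datasets/fibre_rollout.py  (hypothetical example)

DB_SCHEMA = """
TABLE fibre_rollout (
    postcode   VARCHAR,  -- UK postcode
    operator   VARCHAR,  -- network operator name
    live_date  DATE      -- date full-fibre service went live
)
"""

SQL_EXAMPLES = """
-- premises passed per operator
SELECT operator, COUNT(*) FROM fibre_rollout GROUP BY operator;
"""

def get_dataset_summary():
    """Always visible to agents - keep it to one line."""
    return "fibre_rollout: full-fibre availability by UK postcode and operator"

def get_db_info():
    """Full context, loaded only when an agent requests this dataset."""
    return f"{DB_SCHEMA}\n{SQL_EXAMPLES}"
```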

optimization tips

  • prioritize essential info in summaries
  • use clear table descriptions that help agents choose the right dataset
  • include common query patterns in examples
  • sanity check data against known UK facts in instructions

publishing to PyPI (for maintainers)

build and test locally:

# Build the package with UV (super fast!)
uv build

# Test installation locally
pip install dist/point_topic_mcp-*.whl

# Test the command works
point-topic-mcp

publish to PyPI:

# Set up PyPI credentials in ~/.pypirc first (one time setup)
# [pypi]
#   username = __token__
#   password = pypi-xxxxx...

# Publish to PyPI with the publish script
./publish_to_pypi.sh

test installation from PyPI:

pip install point-topic-mcp
point-topic-mcp
