
UK broadband data analysis MCP server with Snowflake integration


Point Topic MCP Server

UK broadband data analysis server via Model Context Protocol. Simple stdio-based server for local development and Claude Desktop integration.

✅ what's implemented

database tools (requires Snowflake credentials):

  • assemble_dataset_context() - get schemas and examples for datasets (upc, upc_take_up, upc_forecast, tariffs, ontology)
  • execute_query() - run safe read-only SQL queries
  • describe_table() - get table schema details
  • get_la_code() / get_la_list_full() - local authority lookups
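execute_query() accepts only read-only SQL. As an illustration of the kind of guard that implies (the server's actual validation logic may differ), a minimal sketch:

```python
# Sketch of a read-only SQL guard like the one execute_query() implies.
# The server's real validation may differ - this is illustrative only.
import re

FORBIDDEN = re.compile(
    r"\b(INSERT|UPDATE|DELETE|DROP|ALTER|CREATE|TRUNCATE|MERGE|GRANT)\b",
    re.IGNORECASE,
)

def is_read_only(sql: str) -> bool:
    """Allow a single SELECT/WITH statement with no write keywords."""
    stripped = sql.strip().rstrip(";")
    if ";" in stripped:  # reject multi-statement input
        return False
    if FORBIDDEN.search(stripped):
        return False
    return stripped.upper().startswith(("SELECT", "WITH"))

assert is_read_only("SELECT * FROM UPC LIMIT 10")
assert not is_read_only("DROP TABLE UPC")
```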

chart tools:

  • get_point_topic_public_chart_catalog() - browse public charts (no auth needed)
  • get_point_topic_public_chart_csv() - get public chart data as CSV (no auth needed)
  • get_point_topic_chart_catalog() - get complete catalog including private charts (requires API key)
  • get_point_topic_chart_csv() - get any chart data as CSV with authentication (requires API key)
  • generate_authenticated_chart_url() - create signed URLs for private charts (requires API key)
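generate_authenticated_chart_url() returns signed URLs for private charts. One plausible shape for such signing is an HMAC over the chart id and an expiry timestamp; this is purely a sketch, and the real scheme, parameter names, and base URL are assumptions:

```python
# Hypothetical HMAC-based URL signing - the actual scheme used by
# generate_authenticated_chart_url() is not documented here.
import hashlib
import hmac
import time
from urllib.parse import urlencode

def sign_chart_url(chart_id: str, api_key: str,
                   base: str = "https://charts.example.com") -> str:
    expires = int(time.time()) + 3600  # assumed 1-hour validity
    payload = f"{chart_id}:{expires}".encode()
    sig = hmac.new(api_key.encode(), payload, hashlib.sha256).hexdigest()
    return f"{base}/charts/{chart_id}?" + urlencode(
        {"expires": expires, "sig": sig}
    )

url = sign_chart_url("chart-42", "your_chart_api_key")
```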

server info:

  • get_mcp_server_capabilities() - check which tools are available and debug missing credentials

conditional availability: tools only appear if required environment variables are set
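conceptually, get_mcp_server_capabilities() reports which tool groups are enabled for the current environment. A sketch of that idea (the grouping and return shape are assumptions; the environment variable names match this project's config):

```python
# Sketch of the capability report get_mcp_server_capabilities() implies.
# Grouping and return shape are assumptions.
import os

def server_capabilities() -> dict[str, bool]:
    return {
        "database_tools": all(
            os.getenv(v) for v in ("SNOWFLAKE_USER", "SNOWFLAKE_PASSWORD")
        ),
        "public_chart_tools": True,  # never require credentials
        "private_chart_tools": bool(os.getenv("CHART_API_KEY")),
    }

caps = server_capabilities()
```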

installation (for end users)

simple pip install:

pip install point-topic-mcp

add to your MCP client (Claude Desktop, Cursor, etc.):

{
  "mcpServers": {
    "point-topic": {
      "command": "point-topic-mcp",
      "env": {
        "SNOWFLAKE_USER": "your_user", 
        "SNOWFLAKE_PASSWORD": "your_password",
        "CHART_API_KEY": "your_chart_api_key"
      }
    }
  }
}

note: environment variables are optional - tools will only appear if credentials are provided. use get_mcp_server_capabilities() to check which tools are available.

Claude Desktop config location:

  • Mac: ~/Library/Application Support/Claude/claude_desktop_config.json
  • Windows: %APPDATA%\Claude\claude_desktop_config.json

development setup

setup: uv sync

for local development with claude desktop (this will add the server to your claude desktop config):

uv run mcp install src/point_topic_mcp/server_local.py --with "snowflake-connector-python[pandas]" -f .env

for mcp inspector:

uv run mcp dev src/point_topic_mcp/server_local.py

environment configuration:

create .env file with your credentials:

# Snowflake database credentials (for database tools)
SNOWFLAKE_USER=your_user
SNOWFLAKE_PASSWORD=your_password

# Chart API key (for authenticated chart generation)
CHART_API_KEY=your_chart_api_key
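to sanity-check the file before launching the server, here is a stdlib-only parser for this simple KEY=value format (the server itself loads credentials with python-dotenv, which handles quoting and more edge cases):

```python
# Minimal KEY=value parser for sanity-checking a .env file.
# The server uses python-dotenv, which handles more edge cases.

def parse_env(text: str) -> dict[str, str]:
    env = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blanks and comments
        key, _, value = line.partition("=")
        env[key.strip()] = value.strip()
    return env

sample = """
# Snowflake database credentials (for database tools)
SNOWFLAKE_USER=your_user
SNOWFLAKE_PASSWORD=your_password

# Chart API key (for authenticated chart generation)
CHART_API_KEY=your_chart_api_key
"""
env = parse_env(sample)
```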

architecture

stdio transport: communicates with MCP clients via standard input/output for local integration

auto-discovery: tools and datasets are automatically discovered from module files - no manual registration needed

conditional tools: tools only register if required environment variables are present - use get_mcp_server_capabilities() to debug

modular design:

  • src/point_topic_mcp/tools/ - tool modules auto-discovered and registered
  • src/point_topic_mcp/context/datasets/ - dataset modules auto-discovered for context assembly
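the discovery mechanism can be pictured as collecting every public function defined in a module. A stdlib-only sketch of that idea (the project's actual implementation may differ):

```python
# Sketch of function auto-discovery: collect public functions defined in a
# module. The project's real implementation may differ.
import inspect
import types
from typing import Callable

def discover_tools(module: types.ModuleType) -> dict[str, Callable]:
    """Return the module's own public functions, keyed by name."""
    return {
        name: fn
        for name, fn in inspect.getmembers(module, inspect.isfunction)
        if not name.startswith("_") and fn.__module__ == module.__name__
    }

# Simulate a my_feature_tools.py module:
mod = types.ModuleType("my_feature_tools")
exec(
    "def my_new_tool(param: str) -> str:\n"
    "    return param.upper()\n"
    "def _helper():\n"
    "    pass\n",
    mod.__dict__,
)
tools = discover_tools(mod)
assert list(tools) == ["my_new_tool"]  # _helper stays private
```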

adding new tools

this project uses auto-discovery for tools - just add a function and it becomes available.

tool structure

create a file in src/point_topic_mcp/tools/ ending with _tools.py:

# src/point_topic_mcp/tools/my_feature_tools.py

from typing import Optional
from mcp.server.fastmcp import Context
from mcp.server.session import ServerSession

def my_new_tool(param: str, ctx: Optional[Context[ServerSession, None]] = None) -> str:
    """Tool description visible to agents."""
    # your implementation
    return "result"

that's it! the tool is automatically discovered and registered.

conditional tools (require credentials)

use check_env_vars() to conditionally define tools:

import os
from typing import Optional

from dotenv import load_dotenv
from mcp.server.fastmcp import Context
from mcp.server.session import ServerSession

from point_topic_mcp.core.utils import check_env_vars

load_dotenv()

if check_env_vars('my_feature', ['MY_API_KEY']):
    def authenticated_tool(ctx: Optional[Context[ServerSession, None]] = None) -> str:
        """Only available if MY_API_KEY is set."""
        api_key = os.getenv('MY_API_KEY')
        # use api_key...
        return "result"
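a hypothetical implementation of such a helper, to show the intent (the real point_topic_mcp.core.utils.check_env_vars may behave differently):

```python
# Hypothetical check_env_vars - the real helper in
# point_topic_mcp.core.utils may behave differently.
import os

def check_env_vars(feature: str, required: list[str]) -> bool:
    """True if every required variable is set and non-empty;
    otherwise report what is missing for the named feature."""
    missing = [v for v in required if not os.getenv(v)]
    if missing:
        print(f"[{feature}] disabled, missing: {', '.join(missing)}")
        return False
    return True

os.environ["MY_API_KEY"] = "demo"
assert check_env_vars("my_feature", ["MY_API_KEY"])
```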

key principles

  1. auto-discovery: any public function in *_tools.py files becomes a tool
  2. conditional registration: wrap in if check_env_vars() for authenticated tools
  3. clear docstrings: visible to agents at all times - keep concise and actionable
  4. type hints: use for better agent understanding

adding new datasets

this project uses a modular dataset system that allows easy addition of new data sources. each dataset is self-contained and automatically discovered by the MCP server.

dataset structure

each dataset is a python module in src/point_topic_mcp/context/datasets/ with two required functions:

def get_dataset_summary():
    """Brief description visible to agents at all times.
    Keep concise - this goes in every agent prompt."""
    return "short description of what data is available"

def get_db_info():
    """Complete context: schema, instructions, examples.
    Only loaded when agent requests this dataset."""
    # DB_INFO, DB_SCHEMA and SQL_EXAMPLES are module-level string constants
    return f"""
    {DB_INFO}

    {DB_SCHEMA}

    {SQL_EXAMPLES}
    """

key principles

  1. context window efficiency: keep get_dataset_summary() extremely concise - it's always visible to agents
  2. lazy loading: full context via get_db_info() only loads when needed
  3. self-contained: each dataset module includes all its own schema, examples, and usage notes
  4. auto-discovery: new .py files in the datasets directory are automatically available

adding a new dataset

  1. create the module: src/point_topic_mcp/context/datasets/your_dataset.py
  2. implement required functions: get_dataset_summary() and get_db_info()
  3. test locally: uv run mcp dev src/point_topic_mcp/server_local.py
  4. verify discovery: agent should see your dataset in assemble_dataset_context() tool description

see existing modules (upc.py, upc_take_up.py, upc_forecast.py) for structure examples.
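for a fully self-contained illustration of the contract, here is a made-up module (the table, column, and dataset names below are invented, not real Point Topic schema):

```python
# src/point_topic_mcp/context/datasets/example_dataset.py
# Invented schema for illustration only - not a real Point Topic dataset.

DB_SCHEMA = """
TABLE EXAMPLE_COVERAGE (
    LA_CODE VARCHAR,          -- local authority code
    YEAR INTEGER,
    PREMISES_PASSED INTEGER   -- premises passed by fibre
)
"""

SQL_EXAMPLES = """
-- premises passed per local authority in a given year
SELECT LA_CODE, PREMISES_PASSED
FROM EXAMPLE_COVERAGE
WHERE YEAR = 2024;
"""

def get_dataset_summary():
    # always visible to agents: keep it to one short line
    return "example_coverage: fibre premises passed by local authority and year"

def get_db_info():
    # loaded only when the agent requests this dataset
    return f"{DB_SCHEMA}\n{SQL_EXAMPLES}"
```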

optimization tips

  • prioritize essential info in summaries
  • use clear table descriptions that help agents choose the right dataset
  • include common query patterns in examples
  • sanity check data against known UK facts in instructions

publishing to PyPI (for maintainers)

build and test locally:

# Build the package with uv (super fast!)
uv build

# Test installation locally
pip install dist/point_topic_mcp-*.whl

# Test the command works
point-topic-mcp

publish to PyPI:

# Set up PyPI credentials in ~/.pypirc first (one time setup)
# [pypi]
#   username = __token__
#   password = pypi-xxxxx...

# Publish to PyPI with the publish script
./publish_to_pypi.sh

test installation from PyPI:

pip install point-topic-mcp
point-topic-mcp
