
Point Topic MCP Server

A UK broadband data analysis server exposed via the Model Context Protocol (MCP), with API key authentication and per-user permissions.

✅ what's implemented

three server modes:

  • local development (server_local.py) - stdio transport, no auth, claude desktop
  • remote http (server_remote.py) - streamable-http transport, api key auth, modern clients
  • remote sse (server_remote_sse.py) - sse transport, api key auth, broader compatibility

core tools:

  • assemble_dataset_context() - gets database schemas and examples for datasets (upc, upc_take_up, upc_forecast)
  • execute_query() - runs safe read-only sql queries with user-based row limits
  • check_user_permissions() - shows user access levels and restrictions
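
The read-only guarantee of execute_query() can be sketched as a simple statement check (a minimal sketch, illustrative only; the server's actual validation logic may differ and should also handle multi-statement input):

```python
import re

# Statements that mutate data or schema; anything matching these is rejected.
FORBIDDEN = re.compile(
    r"^\s*(insert|update|delete|drop|alter|create|truncate|grant|revoke|merge)\b",
    re.IGNORECASE,
)

def is_read_only(sql: str) -> bool:
    """Return True only for plain SELECT / WITH queries."""
    if FORBIDDEN.match(sql):
        return False
    return bool(re.match(r"^\s*(select|with)\b", sql, re.IGNORECASE))
```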

user permissions: email-based access control via config/users.yaml with three levels (basic, premium, full)

authentication: API key based with bearer token format

installation (for end users)

simple pip install:

pip install point-topic-mcp

add to your MCP client (Claude Desktop, Cursor, etc.):

{
  "mcpServers": {
    "point-topic": {
      "command": "point-topic-mcp",
      "env": {
        "SNOWFLAKE_ACCOUNT": "your_account",
        "SNOWFLAKE_USER": "your_user", 
        "SNOWFLAKE_PASSWORD": "your_password",
        "SNOWFLAKE_WAREHOUSE": "your_warehouse",
        "SNOWFLAKE_DATABASE": "your_database",
        "SNOWFLAKE_SCHEMA": "your_schema"
      }
    }
  }
}

Claude Desktop config location:

  • Mac: ~/Library/Application Support/Claude/claude_desktop_config.json
  • Windows: %APPDATA%\Claude\claude_desktop_config.json

quick start - local development

setup: uv sync

for claude desktop:

./deploy.sh

for mcp inspector:

uv run mcp dev server.py

network access with api key authentication

for colleagues to use your server

they just need:

  1. your server URL (http or sse)
  2. their API key from config/users.yaml
  3. configure their MCP client with URL + API key
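
As a quick sanity check outside an MCP client, the header a client must send can be built like this (the URL and key are placeholders; no request is actually sent here):

```python
import urllib.request

server_url = "http://192.168.1.50:8000/mcp"  # your server's address (example)
api_key = "pt_live_sk_abc123"                # the user's key from config/users.yaml

# Build the request with the bearer token the server expects.
req = urllib.request.Request(
    server_url,
    headers={"Authorization": f"Bearer {api_key}"},
)
# urllib.request.urlopen(req) would send it; here we only construct it.
```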

server setup (run locally or on network)

1. environment configuration:

create a .env file with your snowflake credentials:

# Your Snowflake credentials
SNOWFLAKE_ACCOUNT=your_account
SNOWFLAKE_USER=your_user
SNOWFLAKE_PASSWORD=your_password
SNOWFLAKE_WAREHOUSE=your_warehouse
SNOWFLAKE_DATABASE=your_database
SNOWFLAKE_SCHEMA=your_schema
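
The server can then pick these up from the environment, roughly like this (a sketch; the real connection code passes the result to the snowflake connector):

```python
import os

def snowflake_params() -> dict:
    """Collect the SNOWFLAKE_* settings above into connection parameters."""
    keys = ["ACCOUNT", "USER", "PASSWORD", "WAREHOUSE", "DATABASE", "SCHEMA"]
    return {k.lower(): os.environ[f"SNOWFLAKE_{k}"] for k in keys}

# snowflake.connector.connect(**snowflake_params()) would open the session.
```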

2. add colleague emails and generate API keys:

edit config/users.yaml:

users:
  colleague1@gmail.com:
    access_level: premium
    name: "Colleague Name"
    api_key: "pt_live_sk_abc123..." # Generate unique API key

  colleague2@company.com:
    access_level: basic
    name: "Another Colleague"
    api_key: "pt_live_sk_def456..." # Generate unique API key

3. choose and start your server:

option a: streamable http (modern, recommended)

python server_remote.py
# runs on http://localhost:8000/mcp
# colleagues use: Authorization: Bearer pt_live_sk_...

option b: sse transport (broader compatibility)

python server_remote_sse.py
# runs on http://localhost:8001/sse
# colleagues use: Authorization: Bearer pt_live_sk_...

find your IP: ifconfig (mac/linux) or ipconfig (windows)

share with colleagues: server URL + their API key from users.yaml

deployment options

local network: colleagues on the same wifi can reach the server via your IP

cloud deployment: vps/cloud hosting for internet access

testing:

  • http server: uv run mcp dev http://localhost:8000/mcp
  • sse server: uv run mcp dev http://localhost:8001/sse

user permissions system

three access levels defined in config/users.yaml:

  • basic: limited datasets, 1000 row limit
  • premium: more datasets, 50000 row limit
  • full: all datasets, unlimited rows

example configuration:

# config/users.yaml
users:
  peter.donaghey@point-topic.com:
    access_level: full
    name: "Peter Donaghey"

  colleague@gmail.com:
    access_level: premium
    datasets: ["upc", "upc_take_up"]
    expires: "2025-12-31"

access_levels:
  basic:
    description: "Aggregated data only"
    datasets: ["upc"]
    row_limit: 1000
    tools: ["assemble_dataset_context", "execute_query"]

  premium:
    description: "Detailed data access"
    datasets: ["upc", "upc_take_up"]
    row_limit: 50000
    tools: ["assemble_dataset_context", "execute_query"]

  full:
    description: "Complete access"
    datasets: ["upc", "upc_take_up", "upc_forecast"]
    row_limit: null
    tools: ["*"]

architecture

three server modes:

  • server_local.py - stdio transport, no auth, claude desktop integration
  • server_remote.py - streamable-http transport, api key auth, network access
  • server_remote_sse.py - sse transport, api key auth, broader compatibility

authentication flow:

  1. user configures MCP client with server URL + API key
  2. client sends API key in Authorization header: Bearer pt_live_sk_...
  3. server validates API key against config/users.yaml
  4. tools enforce user restrictions (datasets, row limits)
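
Step 4's row-limit enforcement can be sketched by wrapping the user's query in an outer LIMIT (illustrative; a row_limit of None means unlimited, as in the full level):

```python
def apply_row_limit(sql: str, row_limit):
    """Cap the result size by wrapping the query in an outer LIMIT."""
    if row_limit is None:
        return sql  # 'full' users are unlimited
    inner = sql.rstrip().rstrip(";")
    return f"SELECT * FROM ({inner}) LIMIT {int(row_limit)}"
```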

transport comparison:

  • streamable-http: newest MCP standard, better performance, fewer clients support it yet
  • sse: traditional MCP transport, broader client compatibility, server-sent events

publishing to PyPI (for maintainers)

build and test locally:

# Build the package with UV (super fast!)
uv build

# Test installation locally
uv add ./dist/point_topic_mcp-*.whl

# Test the command works
point-topic-mcp --help

publish to PyPI:

# Set up PyPI credentials in ~/.pypirc first (one time setup)
# [pypi]
#   username = __token__
#   password = pypi-xxxxx...

# Upload to PyPI with UV
uv publish

# Or use the publish script
./publish.sh

test installation from PyPI:

pip install point-topic-mcp
point-topic-mcp
