# Point Topic MCP Server

UK broadband data analysis server via the Model Context Protocol. A simple stdio-based server for local development and Claude Desktop integration.
## ✅ what's implemented

core tools:

- `assemble_dataset_context()` - gets database schemas and examples for datasets (upc, upc_take_up, upc_forecast, ontology)
- `execute_query()` - runs safe read-only SQL queries against Snowflake

authentication: environment variables for Snowflake credentials
## installation (for end users)

simple pip install:

```bash
pip install point-topic-mcp
```
add to your MCP client (Claude Desktop, Cursor, etc.):

```json
{
  "mcpServers": {
    "point-topic": {
      "command": "point-topic-mcp",
      "env": {
        "SNOWFLAKE_USER": "your_user",
        "SNOWFLAKE_PASSWORD": "your_password"
      }
    }
  }
}
```
Claude Desktop config location:

- Mac: `~/Library/Application Support/Claude/claude_desktop_config.json`
- Windows: `%APPDATA%\Claude\claude_desktop_config.json`
## development setup

setup:

```bash
uv sync
```

for local development with claude desktop (this adds the server to your claude desktop config):

```bash
uv run mcp install src/point_topic_mcp/server_local.py --with "snowflake-connector-python[pandas]" -f .env
```

for the mcp inspector:

```bash
uv run mcp dev src/point_topic_mcp/server_local.py
```

environment configuration: create a `.env` file with your snowflake credentials:
```
# Your Snowflake credentials
SNOWFLAKE_USER=your_user
SNOWFLAKE_PASSWORD=your_password
```
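At startup the server picks these values up from the environment. A minimal sketch of how that lookup might work - `load_snowflake_config` is a hypothetical helper for illustration, not the package's actual API:

```python
import os

# hypothetical helper showing how Snowflake credentials could be read
# from environment variables, failing fast when one is missing
def load_snowflake_config() -> dict[str, str]:
    required = ("SNOWFLAKE_USER", "SNOWFLAKE_PASSWORD")
    missing = [name for name in required if not os.environ.get(name)]
    if missing:
        raise RuntimeError(f"missing Snowflake credentials: {', '.join(missing)}")
    # map SNOWFLAKE_USER -> user, SNOWFLAKE_PASSWORD -> password
    return {name.split("_", 1)[1].lower(): os.environ[name] for name in required}
```

The resulting dict could then be passed to `snowflake.connector.connect(**config, ...)` together with account and warehouse settings.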
## architecture

- stdio transport: communicates with MCP clients (Claude Desktop, Cursor) via standard input/output for local integration
- tools: two main tools for Snowflake data analysis:
  - `assemble_dataset_context()` - provides schemas and context for available datasets
  - `execute_query()` - executes safe read-only SQL queries
- authentication: uses environment variables for Snowflake database credentials
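The "safe read-only" guarantee in `execute_query()` implies some validation before SQL reaches Snowflake. A sketch of what such a guard might look like - this `is_read_only` function is illustrative, not the package's actual implementation, and a real deployment would also rely on Snowflake's role-based access controls:

```python
import re

# hypothetical guard: accept only statements that start with a read
# keyword and contain no mutating keywords anywhere
_WRITE_KEYWORDS = re.compile(
    r"\b(insert|update|delete|merge|drop|alter|create|truncate|grant|revoke|call)\b",
    re.IGNORECASE,
)
_READ_STARTERS = {"select", "with", "show", "describe", "explain"}

def is_read_only(sql: str) -> bool:
    # strip SQL comments so keywords inside them don't affect the check
    stripped = re.sub(r"--[^\n]*|/\*.*?\*/", " ", sql, flags=re.DOTALL).strip()
    if not stripped:
        return False
    first_word = stripped.split(None, 1)[0].lower()
    return first_word in _READ_STARTERS and not _WRITE_KEYWORDS.search(stripped)
```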
## adding new datasets

this project uses a modular dataset system that allows easy addition of new data sources. each dataset is self-contained and automatically discovered by the MCP server.

### dataset structure

each dataset is a python module in `src/point_topic_mcp/context/datasets/` with two required functions:
```python
def get_dataset_summary():
    """Brief description visible to agents at all times.
    Keep concise - this goes in every agent prompt."""
    return "short description of what data is available"

def get_db_info():
    """Complete context: schema, instructions, examples.
    Only loaded when agent requests this dataset."""
    # DB_INFO, DB_SCHEMA and SQL_EXAMPLES are constants defined in the module
    return f"""
    {DB_INFO}
    {DB_SCHEMA}
    {SQL_EXAMPLES}
    """
```
### key principles

- context window efficiency: keep `get_dataset_summary()` extremely concise - it's always visible to agents
- lazy loading: full context via `get_db_info()` only loads when needed
- self-contained: each dataset module includes all its own schema, examples, and usage notes
- auto-discovery: new `.py` files in the datasets directory are automatically available
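The auto-discovery step could be built on `importlib`. A self-contained sketch under stated assumptions - the function name, registry shape, and interface check are illustrative, not the server's actual code:

```python
import importlib.util
from pathlib import Path

# hypothetical auto-discovery: load every public .py module in a datasets
# directory and register those that implement the required interface
def discover_datasets(datasets_dir: Path) -> dict[str, str]:
    registry: dict[str, str] = {}
    for path in sorted(Path(datasets_dir).glob("*.py")):
        if path.stem.startswith("_"):  # skip __init__.py and private modules
            continue
        spec = importlib.util.spec_from_file_location(path.stem, path)
        module = importlib.util.module_from_spec(spec)
        spec.loader.exec_module(module)
        if hasattr(module, "get_dataset_summary") and hasattr(module, "get_db_info"):
            registry[path.stem] = module.get_dataset_summary()
    return registry
```

With this shape, dropping a new module into the directory is enough for its summary to appear in the tool description on the next server start.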
### adding a new dataset

1. create the module: `src/point_topic_mcp/context/datasets/your_dataset.py`
2. implement the required functions: `get_dataset_summary()` and `get_db_info()`
3. test locally: `uv run mcp dev src/point_topic_mcp/server_local.py`
4. verify discovery: the agent should see your dataset in the `assemble_dataset_context()` tool description

see the existing modules (`upc.py`, `upc_take_up.py`, `upc_forecast.py`) for structure examples.
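As a concrete illustration, a hypothetical `your_dataset.py` following the steps above might look like this - the table name, schema, and example query are invented for the example:

```python
# src/point_topic_mcp/context/datasets/your_dataset.py (hypothetical example)

DB_SCHEMA = """
TABLE your_table (
    postcode VARCHAR,  -- UK postcode
    value    NUMBER    -- example metric
)
"""

SQL_EXAMPLES = """
-- average metric per postcode area
SELECT LEFT(postcode, 2) AS area, AVG(value) AS avg_value
FROM your_table
GROUP BY 1;
"""

def get_dataset_summary():
    """Always visible to agents - keep it short."""
    return "example metric by UK postcode (your_table)"

def get_db_info():
    """Full context, loaded only when the agent requests this dataset."""
    return f"""
{DB_SCHEMA}
{SQL_EXAMPLES}
"""
```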
### optimization tips

- prioritize essential info in summaries
- use clear table descriptions that help agents choose the right dataset
- include common query patterns in examples
- sanity-check data against known UK facts in instructions
## publishing to PyPI (for maintainers)

build and test locally:

```bash
# Build the package with UV (super fast!)
uv build

# Test installation locally
pip install dist/point_topic_mcp-*.whl

# Test the command works
point-topic-mcp
```

publish to PyPI:

```bash
# Set up PyPI credentials in ~/.pypirc first (one-time setup):
# [pypi]
# username = __token__
# password = pypi-xxxxx...

# Publish to PyPI with the publish script
./publish_to_pypi.sh
```

test installation from PyPI:

```bash
pip install point-topic-mcp
point-topic-mcp
```