Point Topic MCP Server
UK broadband data analysis server via Model Context Protocol. Simple stdio-based server for local development and Claude Desktop integration.
✅ what's implemented
database tools (requires Snowflake credentials):
- assemble_dataset_context() - get schemas and examples for datasets (upc, upc_take_up, upc_forecast, tariffs, ontology)
- execute_query() - run safe read-only SQL queries
- describe_table() - get table schema details
- get_la_code() / get_la_list_full() - local authority lookups
chart tools:
- get_point_topic_public_chart_catalog() - browse public charts (no auth needed)
- get_point_topic_public_chart_csv() - get public chart data as CSV (no auth needed)
- get_point_topic_chart_catalog() - get complete catalog including private charts (requires API key)
- get_point_topic_chart_csv() - get any chart data as CSV with authentication (requires API key)
- generate_authenticated_chart_url() - create signed URLs for private charts (requires API key)
server info:
- get_mcp_server_capabilities() - check which tools are available and debug missing credentials
conditional availability: tools only appear if required environment variables are set
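to check availability programmatically (outside an agent), here is a minimal sketch using the official MCP Python client SDK - the credentials are placeholders, and merging os.environ is just one way to pass them:
# minimal sketch: connect over stdio, list registered tools, then call the capabilities tool
# placeholder credentials - omit them to see the reduced tool set
import asyncio
import os

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main() -> None:
    params = StdioServerParameters(
        command="point-topic-mcp",
        env={**os.environ, "SNOWFLAKE_USER": "your_user", "SNOWFLAKE_PASSWORD": "your_password"},
    )
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])
            result = await session.call_tool("get_mcp_server_capabilities", {})
            print(result.content)

asyncio.run(main())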
installation (for end users)
simple pip install:
pip install point-topic-mcp
add to your MCP client (Claude Desktop, Cursor, etc.):
{
  "mcpServers": {
    "point-topic": {
      "command": "point-topic-mcp",
      "env": {
        "SNOWFLAKE_USER": "your_user",
        "SNOWFLAKE_PASSWORD": "your_password",
        "CHART_API_KEY": "your_chart_api_key"
      }
    }
  }
}
note: environment variables are optional - tools will only appear if credentials are provided. use get_mcp_server_capabilities() to check which tools are available.
Claude Desktop config location:
- Mac: ~/Library/Application Support/Claude/claude_desktop_config.json
- Windows: %APPDATA%\Claude\claude_desktop_config.json
development setup
setup: uv sync
for local development with claude desktop (this adds the server to your claude desktop config):
uv run mcp install src/point_topic_mcp/server_local.py --with "snowflake-connector-python[pandas]" -f .env
for mcp inspector:
uv run mcp dev src/point_topic_mcp/server_local.py
environment configuration:
create .env file with your credentials:
# Snowflake database credentials (for database tools)
SNOWFLAKE_USER=your_user
SNOWFLAKE_PASSWORD=your_password
# Chart API key (for authenticated chart generation)
CHART_API_KEY=your_chart_api_key
architecture
stdio transport: communicates with MCP clients via standard input/output for local integration
auto-discovery: tools and datasets are automatically discovered from module files - no manual registration needed
conditional tools: tools only register if required environment variables are present - use get_mcp_server_capabilities() to debug
modular design:
- src/point_topic_mcp/tools/ - tool modules auto-discovered and registered
- src/point_topic_mcp/context/datasets/ - dataset modules auto-discovered for context assembly
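the loader itself isn't shown in this README; as a rough, hypothetical sketch (not the project's actual code) of how *_tools.py auto-discovery can be wired into FastMCP:
# hypothetical sketch of a module auto-discovery loader - illustrative only
import importlib
import inspect
import pkgutil

from mcp.server.fastmcp import FastMCP

mcp = FastMCP("point-topic")

def register_tool_modules(package_name: str = "point_topic_mcp.tools") -> None:
    """Import every *_tools.py module and register its public functions as MCP tools."""
    package = importlib.import_module(package_name)
    for module_info in pkgutil.iter_modules(package.__path__):
        if not module_info.name.endswith("_tools"):
            continue
        module = importlib.import_module(f"{package_name}.{module_info.name}")
        for name, func in inspect.getmembers(module, inspect.isfunction):
            # only functions defined in that module, skipping private helpers
            if func.__module__ == module.__name__ and not name.startswith("_"):
                mcp.add_tool(func)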
adding new tools
this project uses auto-discovery for tools - just add a function and it becomes available.
tool structure
create a file in src/point_topic_mcp/tools/ ending with _tools.py:
# src/point_topic_mcp/tools/my_feature_tools.py
from typing import Optional

from mcp.server.fastmcp import Context
from mcp.server.session import ServerSession

def my_new_tool(param: str, ctx: Optional[Context[ServerSession, None]] = None) -> str:
    """Tool description visible to agents."""
    # your implementation
    return "result"
that's it! the tool is automatically discovered and registered.
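since tools are plain python functions, you can also sanity-check one directly before involving any MCP client (my_feature_tools / my_new_tool are the hypothetical names from the snippet above):
# direct call - no server, inspector or agent needed
from point_topic_mcp.tools.my_feature_tools import my_new_tool

print(my_new_tool("test input"))  # -> "result"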
conditional tools (require credentials)
use check_env_vars() to conditionally define tools:
import os
from typing import Optional

from dotenv import load_dotenv
from mcp.server.fastmcp import Context
from mcp.server.session import ServerSession

from point_topic_mcp.core.utils import check_env_vars

load_dotenv()

if check_env_vars('my_feature', ['MY_API_KEY']):

    def authenticated_tool(ctx: Optional[Context[ServerSession, None]] = None) -> str:
        """Only available if MY_API_KEY is set."""
        api_key = os.getenv('MY_API_KEY')
        # use api_key...
        return "result"
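the real helper lives in point_topic_mcp.core.utils; as a rough mental model only (not the actual implementation), it behaves roughly like this:
# hypothetical sketch of the check - the real helper is in point_topic_mcp.core.utils
import logging
import os

def check_env_vars(feature_name: str, required_vars: list[str]) -> bool:
    """Return True only when every required environment variable is set and non-empty."""
    missing = [var for var in required_vars if not os.getenv(var)]
    if missing:
        logging.info("skipping %s tools - missing env vars: %s", feature_name, ", ".join(missing))
        return False
    return True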
key principles
- auto-discovery: any public function in *_tools.py files becomes a tool
- conditional registration: wrap in if check_env_vars() for authenticated tools
- clear docstrings: visible to agents at all times - keep concise and actionable
- type hints: use for better agent understanding
adding new datasets
this project uses a modular dataset system that allows easy addition of new data sources. each dataset is self-contained and automatically discovered by the MCP server.
dataset structure
each dataset is a python module in src/point_topic_mcp/context/datasets/ with two required functions:
def get_dataset_summary():
    """Brief description visible to agents at all times.
    Keep concise - this goes in every agent prompt."""
    return "short description of what data is available"

def get_db_info():
    """Complete context: schema, instructions, examples.
    Only loaded when agent requests this dataset."""
    return f"""
    {DB_INFO}
    {DB_SCHEMA}
    {SQL_EXAMPLES}
    """
key principles
- context window efficiency: keep get_dataset_summary() extremely concise - it's always visible to agents
- lazy loading: full context via get_db_info() only loads when needed
- self-contained: each dataset module includes all its own schema, examples, and usage notes
- auto-discovery: new .py files in the datasets directory are automatically available
adding a new dataset
- create the module: src/point_topic_mcp/context/datasets/your_dataset.py
- implement required functions: get_dataset_summary() and get_db_info()
- test locally: uv run mcp dev src/point_topic_mcp/server_local.py
- verify discovery: agent should see your dataset in the assemble_dataset_context() tool description
see existing modules (upc.py, upc_take_up.py, upc_forecast.py) for structure examples.
optimization tips
- prioritize essential info in summaries
- use clear table descriptions that help agents choose the right dataset
- include common query patterns in examples
- sanity check data against known UK facts in instructions
publishing to PyPI (for maintainers)
build and test locally:
# Build the package with UV (super fast!)
uv build
# Test installation locally
pip install dist/point_topic_mcp-*.whl
# Test the command works
point-topic-mcp
publish to PyPI:
# Set up PyPI credentials in ~/.pypirc first (one time setup)
# [pypi]
# username = __token__
# password = pypi-xxxxx...
# Publish to PyPI with the publish script
./publish_to_pypi.sh
test installation from PyPI:
pip install point-topic-mcp
point-topic-mcp