
Project description

mcp-server-spidy MCP server

a web crawler for claude mcp

Components

Resources

The server implements a simple note storage system with:

  • Custom note:// URI scheme for accessing individual notes
  • Each note resource has a name, description, and text/plain mimetype
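Since note:// is an ordinary URI scheme, note URIs can be handled with standard URI parsing. A minimal sketch, assuming only that the note name appears in the URI path (the helper name and URI layout are illustrative, not the server's actual code):

```python
# Illustrative only: extract a note name from a note:// URI.
from urllib.parse import urlparse

def note_name(uri: str) -> str:
    """Return the note name from a note:// URI, e.g. note://internal/todo -> 'todo'."""
    parsed = urlparse(uri)
    if parsed.scheme != "note":
        raise ValueError(f"unsupported scheme: {parsed.scheme}")
    return parsed.path.lstrip("/")
```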

Prompts

The server provides a single prompt:

  • summarize-notes: Creates summaries of all stored notes
    • Optional "style" argument to control detail level (brief/detailed)
    • Generates prompt combining all current notes with style preference
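Conceptually, the prompt generation described above combines every stored note with the requested style. A hedged sketch of that idea (the function name, note format, and wording are assumptions, not the server's actual implementation):

```python
# Illustrative only: combine stored notes with a style preference into one prompt.
def build_summary_prompt(notes: dict[str, str], style: str = "brief") -> str:
    """Build a summarization prompt over all notes; style is 'brief' or 'detailed'."""
    detail = "Give a detailed summary." if style == "detailed" else "Summarize briefly."
    body = "\n".join(f"- {name}: {text}" for name, text in notes.items())
    return f"{detail}\nCurrent notes:\n{body}"
```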

Tools

The server implements one tool:

  • web-crawl: Crawls a website and saves the results to a text file
    • Takes "url" and "output_file" as required string arguments
    • Fetches the content from the specified URL and writes it to the specified text file
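At its core, the behavior described above is fetch-then-write. A minimal stdlib-only sketch of that idea (the function name and signature are illustrative; this is not the server's actual implementation):

```python
# Illustrative only: fetch a URL and write its body to a file.
from urllib.request import urlopen

def web_crawl(url: str, output_file: str) -> int:
    """Fetch `url`, write the response body to `output_file`, return bytes written."""
    with urlopen(url) as resp:
        body = resp.read()
    with open(output_file, "wb") as f:
        f.write(body)
    return len(body)
```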

Configuration

[TODO: Add configuration details specific to your implementation]

Quickstart

Install

Claude Desktop

On macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
On Windows: %APPDATA%/Claude/claude_desktop_config.json
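For a local development checkout, the server entry in claude_desktop_config.json might look like the following sketch, mirroring the `uv --directory … run` invocation used for debugging below (the repository path is illustrative):

```
{
  "mcpServers": {
    "mcp-server-spidy": {
      "command": "uv",
      "args": ["--directory", "/path/to/spidy", "run", "mcp-server-spidy"]
    }
  }
}
```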

Usage

To use the web-crawl tool, call it with the following parameters:

  • url: The URL of the website you want to crawl.
  • output_file: The name of the file where the crawled content will be saved.
Published Servers Configuration

```
"mcpServers": {
  "mcp-server-spidy": {
    "command": "uvx",
    "args": [
      "mcp-server-spidy"
    ]
  }
}
```

Development

Building and Publishing

To prepare the package for distribution:

  1. Sync dependencies and update lockfile:
uv sync
  2. Build package distributions:
uv build

This will create source and wheel distributions in the dist/ directory.

  3. Publish to PyPI:
uv publish

Note: You'll need to set PyPI credentials via environment variables or command flags:

  • Token: --token or UV_PUBLISH_TOKEN
  • Or username/password: --username/UV_PUBLISH_USERNAME and --password/UV_PUBLISH_PASSWORD

Debugging

Since MCP servers run over stdio, debugging can be challenging. For the best debugging experience, we strongly recommend using the MCP Inspector.

You can launch the MCP Inspector via npm with this command:

npx @modelcontextprotocol/inspector uv --directory /Users/jrob/repos/spidy run mcp-server-spidy

Upon launching, the Inspector will display a URL that you can access in your browser to begin debugging.

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

mcp_server_spidy-0.1.2.tar.gz (48.8 kB)

Uploaded Source

Built Distribution

mcp_server_spidy-0.1.2-py3-none-any.whl (5.3 kB)

Uploaded Python 3

File details

Details for the file mcp_server_spidy-0.1.2.tar.gz.

File metadata

  • Download URL: mcp_server_spidy-0.1.2.tar.gz
  • Upload date:
  • Size: 48.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: uv/0.5.6

File hashes

Hashes for mcp_server_spidy-0.1.2.tar.gz:

  • SHA256: 411b2b58bf3e4cbdc8d26b438a121165f4886920f81ea7d75657207fa5526031
  • MD5: 59ffd926fe601daa6ea2eea032ff6c5b
  • BLAKE2b-256: 5d38fe6801cb02880c3b2ca42636eeddb9370207a2e56176eff381bdde11baba


File details

Details for the file mcp_server_spidy-0.1.2-py3-none-any.whl.

File hashes

Hashes for mcp_server_spidy-0.1.2-py3-none-any.whl:

  • SHA256: 78cf25298e77b995c73e8d9f01e39d1d81d481cccdf5ce14e07c17631f1a1305
  • MD5: 29fcae5dfff7d6f56a91566fde7c226a
  • BLAKE2b-256: fcb1a0757d7ddd8db47a2201c4e0f00f8a17f6558d87db697dd2bfef3be2197d

