MCP server for search and retrieval of web crawler content

Project description

MCP Server Webcrawl

Website | GitHub | Docs | PyPI

Bridge the gap between your web crawl and AI language models using Model Context Protocol (MCP). With mcp-server-webcrawl, your AI client filters and analyzes web content under your direction or autonomously. The server includes a full-text search interface with boolean support, resource filtering by type, HTTP status, and more.

mcp-server-webcrawl provides the LLM a complete menu of search options for your web content, and works with a variety of web crawlers, including wget, WARC archives, InterroBot, Katana, and SiteOne.

mcp-server-webcrawl is free and open source. It requires Claude Desktop and Python (>=3.10), and is installed from the command line via pip:

pip install mcp-server-webcrawl

Features

  • Claude Desktop ready
  • Full-text search support
  • Filter by type, status, and more
  • Multi-crawler compatible
  • ChatGPT support coming soon

MCP Configuration

From the Claude Desktop menu, navigate to File > Settings > Developer. Click Edit Config to locate the configuration file, open it in the editor of your choice, and modify the example below to reflect your datasrc path.

You can set up more mcp-server-webcrawl connections under mcpServers as needed.

{
  "mcpServers": {
    "webcrawl": {
      "command": [varies by OS/env, see below],
       "args": [varies by crawler, see below]
    }
  }
}

For step-by-step setup, refer to the Setup Guides.
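
For reference, a complete entry for a wget crawl on macOS might look like the following (the command path and datasrc are illustrative; substitute your own):

{
  "mcpServers": {
    "webcrawl": {
      "command": "/Users/yourusername/.local/bin/mcp-server-webcrawl",
      "args": ["--crawler", "wget", "--datasrc", "/path/to/wget/archives/"]
    }
  }
}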

Windows vs. macOS

On Windows with Python on your PATH, the command should simply be mcp-server-webcrawl.

On macOS, you must use the absolute path to the mcp-server-webcrawl executable in the command field, rather than just the command name.

For example:

"command": "/Users/yourusername/.local/bin/mcp-server-webcrawl",

To find the absolute path of the mcp-server-webcrawl executable on your system:

  1. Open Terminal
  2. Run which mcp-server-webcrawl
  3. Copy the full path returned and use it in your config file
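
A typical result for a per-user pip install looks like this (yours may differ):

$ which mcp-server-webcrawl
/Users/yourusername/.local/bin/mcp-server-webcrawl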

wget (using --mirror)

The datasrc argument should be set to the parent directory of the mirrors.

"args": ["--crawler", "wget", "--datasrc", "/path/to/wget/archives/"]

WARC

The datasrc argument should be set to the parent directory of the WARC files.

"args": ["--crawler", "warc", "--datasrc", "/path/to/warc/archives/"]

InterroBot

The datasrc argument should be set to the direct path to the database.

"args": ["--crawler", "interrobot", "--datasrc", "/path/to/Documents/InterroBot/interrobot.v2.db"]

Katana

The datasrc argument should be set to the parent directory of the text cache files.

"args": ["--crawler", "katana", "--datasrc", "/path/to/katana/archives/"]

SiteOne (using archiving)

The datasrc argument should be set to the parent directory of the archives; archiving must be enabled in SiteOne.

"args": ["--crawler", "siteone", "--datasrc", "/path/to/SiteOne/archives/"]

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

mcp_server_webcrawl-0.8.0.tar.gz (56.0 kB)

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

mcp_server_webcrawl-0.8.0-py3-none-any.whl (71.3 kB)

File details

Details for the file mcp_server_webcrawl-0.8.0.tar.gz.

File metadata

  • Download URL: mcp_server_webcrawl-0.8.0.tar.gz
  • Size: 56.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.13.2

File hashes

Hashes for mcp_server_webcrawl-0.8.0.tar.gz
  • SHA256: ad549ad24e2c3f013ecfcc2bf0883d0e68b6cd6c79eb34a46a1b23192c605007
  • MD5: d080dbd6a5e6d7038d4693020e1782ed
  • BLAKE2b-256: 61115a166ce34fd75d7dbc92408a36fc069a04e3e49c75ccdd85087dc5f3875b

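To check a downloaded archive against the SHA256 digest above, a standard checksum tool works (sha256sum on Linux; use shasum -a 256 on macOS):

sha256sum mcp_server_webcrawl-0.8.0.tar.gz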

File details

Details for the file mcp_server_webcrawl-0.8.0-py3-none-any.whl.

File hashes

Hashes for mcp_server_webcrawl-0.8.0-py3-none-any.whl
  • SHA256: b7886f5dbfc5b6f84fa20b8967c7f78728edce174d8362fe2b31349950245502
  • MD5: d63e3a71c83d79488c4f1b1d4b8d34ab
  • BLAKE2b-256: ebff7f3c78810565c6bd3ae216ad60680eae1d4c78fd166ca504ae61bbab91b7

