MCP server for Crawl4AI integration

Project description

crawl4ai_mcp_service MCP server

Components

Resources

The server implements a simple note storage system with:

  • Custom note:// URI scheme for accessing individual notes
  • Each note resource has a name, a description, and a text/plain MIME type
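The note:// scheme can be parsed with the standard library; the following helper is an illustrative sketch (the `internal` host and the `note_name_from_uri` name are assumptions, not part of the server's published API):

```python
from urllib.parse import urlparse

def note_name_from_uri(uri: str) -> str:
    """Extract the note name from a note:// URI, e.g. note://internal/todo."""
    parsed = urlparse(uri)
    if parsed.scheme != "note":
        raise ValueError(f"unsupported URI scheme: {parsed.scheme!r}")
    # For note://internal/todo, urlparse puts "internal" in netloc
    # and "/todo" in path; strip the leading slash to get the name.
    return parsed.path.lstrip("/")
```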

Prompts

The server provides a single prompt:

  • summarize-notes: Creates summaries of all stored notes
    • Optional "style" argument to control detail level (brief/detailed)
    • Generates prompt combining all current notes with style preference
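The prompt-assembly step might look like the following sketch (function name, wording, and note-store shape are assumptions for illustration):

```python
def build_summary_prompt(notes: dict[str, str], style: str = "brief") -> str:
    """Combine all stored notes into one summarization prompt.

    `style` controls detail level: "brief" (default) or "detailed".
    """
    detail = " Give extensive details." if style == "detailed" else ""
    body = "\n".join(f"- {name}: {content}" for name, content in notes.items())
    return f"Here are the current notes to summarize:{detail}\n{body}"
```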

Tools

The server implements one tool:

  • add-note: Adds a new note to the server
    • Takes "name" and "content" as required string arguments
    • Updates server state and notifies clients of resource changes
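Stripped of the MCP plumbing, the tool's state update reduces to a dictionary write plus a change notification; this plain-Python sketch (names and return message are assumptions) shows the core logic:

```python
notes: dict[str, str] = {}

def add_note(name: str, content: str) -> str:
    """Store a note under `name`. A real MCP server would follow this
    with a resource-list-changed notification to connected clients."""
    if not name or not content:
        raise ValueError("both 'name' and 'content' are required")
    notes[name] = content
    return f"Added note '{name}' with content: {content}"
```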

Configuration

[TODO: Add configuration details specific to your implementation]

Quickstart

Install

Claude Desktop

On macOS: `~/Library/Application Support/Claude/claude_desktop_config.json`
On Windows: `%APPDATA%/Claude/claude_desktop_config.json`

Development/Unpublished Servers Configuration

```json
"mcpServers": {
  "crawl4ai_mcp_service": {
    "command": "uv",
    "args": [
      "--directory",
      "/Users/unclecode/devs/crawl4ai_mcp_service",
      "run",
      "crawl4ai_mcp_service"
    ]
  }
}
```
Published Servers Configuration

```json
"mcpServers": {
  "crawl4ai_mcp_service": {
    "command": "uvx",
    "args": [
      "crawl4ai_mcp_service"
    ]
  }
}
```
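Either fragment belongs inside the top-level object of claude_desktop_config.json. A complete minimal file using the published-server form might look like this (sketch only; merge with any existing entries rather than overwriting the file):

```json
{
  "mcpServers": {
    "crawl4ai_mcp_service": {
      "command": "uvx",
      "args": ["crawl4ai_mcp_service"]
    }
  }
}
```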

Development

Building and Publishing

To prepare the package for distribution:

  1. Sync dependencies and update the lockfile:

```
uv sync
```

  2. Build package distributions:

```
uv build
```

This will create source and wheel distributions in the dist/ directory.

  3. Publish to PyPI:

```
uv publish
```

Note: You'll need to set PyPI credentials via environment variables or command flags:

  • Token: --token or UV_PUBLISH_TOKEN
  • Or username/password: --username/UV_PUBLISH_USERNAME and --password/UV_PUBLISH_PASSWORD
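For example, credentials could be supplied via the environment before running `uv publish` (both values below are placeholders; `__token__` is PyPI's conventional username for API-token auth):

```shell
# Option A: API token (placeholder value; use your real PyPI token)
export UV_PUBLISH_TOKEN="pypi-REPLACE_ME"

# Option B: username/password; PyPI accepts "__token__" as the username
# with an API token as the password
export UV_PUBLISH_USERNAME="__token__"
export UV_PUBLISH_PASSWORD="pypi-REPLACE_ME"
```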

Debugging

Since MCP servers run over stdio, debugging can be challenging. For the best debugging experience, we strongly recommend using the MCP Inspector.

You can launch the MCP Inspector via npm with this command:

```
npx @modelcontextprotocol/inspector uv --directory /Users/unclecode/devs/crawl4ai_mcp_service run crawl4ai_mcp_service
```

Upon launching, the Inspector will display a URL that you can access in your browser to begin debugging.

Download files

Download the file for your platform.

Source Distribution

crawl4ai_mcp_service-0.1.0.tar.gz (48.5 kB)

Uploaded Source

Built Distribution

crawl4ai_mcp_service-0.1.0-py3-none-any.whl (9.8 kB)

Uploaded Python 3

File details

Details for the file crawl4ai_mcp_service-0.1.0.tar.gz.

File hashes

Hashes for crawl4ai_mcp_service-0.1.0.tar.gz:

  • SHA256: 1d6882842828aa712a549ba880be13ad4b88c25f1f03f2fecec06358733e1713
  • MD5: 729698ca4193672bacd793d169ec2d33
  • BLAKE2b-256: 0413c755c9a9a78faee980319b311500ec43e584aa5ff29e13cc3bbde27297c6

File details

Details for the file crawl4ai_mcp_service-0.1.0-py3-none-any.whl.

File hashes

Hashes for crawl4ai_mcp_service-0.1.0-py3-none-any.whl:

  • SHA256: 1c4dacaea6d8c5f0afafe3a50859e8996e78dbd68d51f51afa1870f8371241f7
  • MD5: 5da0d3ed49bf11bce12461cc6857b2b6
  • BLAKE2b-256: 19cc07b7cca9bbfc865a66b5a5557614db4bd408ce3edd1f6b90e82651acdc48
