A Model Context Protocol (MCP) server for accessing documentation

Note: Claude may default to using its built-in WebFetchTool instead of Docy. To explicitly request Docy's functionality, use a callout like: "Please use Docy to find..."

Docy MCP Server

A Model Context Protocol server that provides documentation access capabilities. This server enables LLMs to search and retrieve content from documentation websites by scraping them with crawl4ai. Built with FastMCP v2.

Using Docy

Here are examples of how Docy can help with common documentation tasks:

# Verify implementation against documentation
Are we implementing Crawl4Ai scrape results correctly? Let's check the documentation.

# Explore API usage patterns
What do the docs say about using mcp.tool? Show me examples from the documentation.

# Compare implementation options
How should we structure our data according to the React documentation? What are the best practices?

With Docy, Claude Code can directly access and analyze documentation from configured sources, making it more effective at providing accurate, documentation-based guidance.

To ensure Claude Code prioritizes Docy for documentation-related tasks, add the following guidelines to your project's CLAUDE.md file:

## Documentation Guidelines
- When checking documentation, prefer using Docy over WebFetchTool
- Use mcp__docy__list_documentation_sources_tool to discover available documentation sources
- Use mcp__docy__fetch_documentation_page to retrieve full documentation pages
- Use mcp__docy__fetch_document_links to discover related documentation

Adding these instructions to your CLAUDE.md file helps Claude Code consistently use Docy instead of its built-in web fetch capabilities when working with documentation.

Available Tools

  • list_documentation_sources_tool - List all available documentation sources

    • No parameters required
  • fetch_documentation_page - Fetch the content of a documentation page by URL as markdown

    • url (string, required): The URL to fetch content from
  • fetch_document_links - Fetch all links from a documentation page

    • url (string, required): The URL to fetch links from
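
Under the hood, an MCP client invokes these tools with JSON-RPC 2.0 `tools/call` requests over the transport (stdio, in the configurations below). As an illustration, this is roughly the message a client would send to call `fetch_documentation_page`; the exact framing is handled by the client library:

```python
import json

# Illustrative JSON-RPC 2.0 request an MCP client sends to invoke
# Docy's fetch_documentation_page tool. Real clients construct and
# frame this message for you.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "fetch_documentation_page",
        "arguments": {"url": "https://docs.crawl4ai.com/"},
    },
}
print(json.dumps(request))
```

The server responds with the page content rendered as markdown in the tool result.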

Prompts

  • documentation_sources

    • List all available documentation sources with their URLs and types
    • No arguments required
  • documentation_page

    • Fetch the full content of a documentation page at a specific URL as markdown
    • Arguments:
      • url (string, required): URL of the specific documentation page to get
  • documentation_links

    • Fetch all links from a documentation page to discover related content
    • Arguments:
      • url (string, required): URL of the documentation page to get links from

Installation

Using uv (recommended)

When using uv, no specific installation is needed; uvx runs mcp-server-docy directly.

Using PIP

Alternatively, you can install mcp-server-docy via pip:

pip install mcp-server-docy

After installation, you can run it as a script using:

DOCY_DOCUMENTATION_URLS="https://docs.crawl4ai.com/,https://react.dev/" python -m mcp_server_docy

Using Docker

You can also use the Docker image:

docker pull oborchers/mcp-server-docy:latest
docker run -i --rm -e DOCY_DOCUMENTATION_URLS="https://docs.crawl4ai.com/,https://react.dev/" oborchers/mcp-server-docy

Configuration

Configure for Claude.app

Add to your Claude settings:

Using uvx
"mcpServers": {
  "docy": {
    "command": "uvx",
    "args": ["mcp-server-docy"],
    "env": {
      "DOCY_DOCUMENTATION_URLS": "https://docs.crawl4ai.com/,https://react.dev/"
    }
  }
}
Using docker
"mcpServers": {
  "docy": {
    "command": "docker",
    "args": ["run", "-i", "--rm", "oborchers/mcp-server-docy:latest"],
    "env": {
      "DOCY_DOCUMENTATION_URLS": "https://docs.crawl4ai.com/,https://react.dev/"
    }
  }
}
Using pip installation
"mcpServers": {
  "docy": {
    "command": "python",
    "args": ["-m", "mcp_server_docy"],
    "env": {
      "DOCY_DOCUMENTATION_URLS": "https://docs.crawl4ai.com/,https://react.dev/"
    }
  }
}

Configure for VS Code

For manual installation, add the following JSON block to your User Settings (JSON) file in VS Code. You can do this by pressing Ctrl + Shift + P and typing Preferences: Open User Settings (JSON).

Optionally, you can add it to a file called .vscode/mcp.json in your workspace. This will allow you to share the configuration with others.

Note that the mcp key is needed when using the mcp.json file.

Using uvx
{
  "mcp": {
    "servers": {
      "docy": {
        "command": "uvx",
        "args": ["mcp-server-docy"],
        "env": {
          "DOCY_DOCUMENTATION_URLS": "https://docs.crawl4ai.com/,https://react.dev/"
        }
      }
    }
  }
}
Using Docker
{
  "mcp": {
    "servers": {
      "docy": {
        "command": "docker",
        "args": ["run", "-i", "--rm", "oborchers/mcp-server-docy:latest"],
        "env": {
          "DOCY_DOCUMENTATION_URLS": "https://docs.crawl4ai.com/,https://react.dev/"
        }
      }
    }
  }
}

Configuration Options

The application can be configured using environment variables:

  • DOCY_DOCUMENTATION_URLS (string): Comma-separated list of URLs to documentation sites to include (e.g., "https://docs.crawl4ai.com/,https://react.dev/")
  • DOCY_DOCUMENTATION_URLS_FILE (string): Path to a file containing documentation URLs, one per line (default: ".docy.urls")
  • DOCY_CACHE_TTL (integer): Cache time-to-live in seconds (default: 3600)
  • DOCY_USER_AGENT (string): Custom User-Agent string for HTTP requests
  • DOCY_DEBUG (boolean): Enable debug logging ("true", "1", "yes", or "y")
  • DOCY_SKIP_CRAWL4AI_SETUP (boolean): Skip running the crawl4ai-setup command at startup ("true", "1", "yes", or "y")

Environment variables can be set directly or via a .env file.
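
A minimal sketch of how these variables might be parsed, using only the names and defaults documented above (the parsing logic here is illustrative, not Docy's actual implementation):

```python
import os

def load_settings(env=os.environ):
    """Parse Docy's documented environment variables (illustrative sketch)."""
    truthy = {"true", "1", "yes", "y"}
    urls = [u.strip() for u in env.get("DOCY_DOCUMENTATION_URLS", "").split(",") if u.strip()]
    return {
        "documentation_urls": urls,
        "urls_file": env.get("DOCY_DOCUMENTATION_URLS_FILE", ".docy.urls"),
        "cache_ttl": int(env.get("DOCY_CACHE_TTL", "3600")),
        "user_agent": env.get("DOCY_USER_AGENT"),
        "debug": env.get("DOCY_DEBUG", "").lower() in truthy,
        "skip_crawl4ai_setup": env.get("DOCY_SKIP_CRAWL4AI_SETUP", "").lower() in truthy,
    }

settings = load_settings({
    "DOCY_DOCUMENTATION_URLS": "https://docs.crawl4ai.com/,https://react.dev/",
    "DOCY_DEBUG": "yes",
})
print(settings["cache_ttl"], settings["debug"], len(settings["documentation_urls"]))
```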

URL Configuration File

As an alternative to setting the DOCY_DOCUMENTATION_URLS environment variable, you can create a .docy.urls file in your project directory with one URL per line:

https://docs.crawl4ai.com/
https://react.dev/
# Lines starting with # are treated as comments
https://docs.python.org/3/

This approach is especially useful for:

  • Projects where you want to share documentation sources with your team
  • Repositories where storing URLs in version control is beneficial
  • Situations where you want to avoid long environment variable values

The server will first check for URLs in the DOCY_DOCUMENTATION_URLS environment variable, and if none are found, it will look for the .docy.urls file.
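A reader for this format is straightforward; the sketch below follows the rules described above (one URL per line, blank lines ignored, `#` lines treated as comments) but is not Docy's actual parser:

```python
import tempfile
from pathlib import Path

def read_urls_file(path=".docy.urls"):
    """Read a .docy.urls-style file: one URL per line, '#' comments skipped."""
    urls = []
    for line in Path(path).read_text().splitlines():
        line = line.strip()
        if line and not line.startswith("#"):
            urls.append(line)
    return urls

# Demo with a temporary file mirroring the example above
sample = (
    "https://docs.crawl4ai.com/\n"
    "https://react.dev/\n"
    "# Lines starting with # are treated as comments\n"
    "https://docs.python.org/3/\n"
)
with tempfile.TemporaryDirectory() as d:
    p = Path(d) / ".docy.urls"
    p.write_text(sample)
    urls = read_urls_file(p)
print(urls)
```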

Caching Behavior

The MCP server automatically caches documentation content to improve performance:

  • At startup, the server pre-fetches and caches all configured documentation URLs from DOCY_DOCUMENTATION_URLS
  • The cache time-to-live (TTL) can be configured via the DOCY_CACHE_TTL environment variable
  • Each new site accessed is automatically loaded into cache to reduce traffic and improve response times
  • Cached content is stored in memory and persists for the duration of the TTL

This caching strategy minimizes external requests and significantly improves response times for frequently accessed documentation.
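
The TTL behavior can be pictured with a minimal in-memory cache; this is a sketch of the strategy described above, not Docy's actual implementation:

```python
import time

class TTLCache:
    """Minimal in-memory cache with time-to-live eviction (illustrative)."""

    def __init__(self, ttl=3600):
        self.ttl = ttl
        self._store = {}  # url -> (content, stored_at)

    def set(self, url, content):
        self._store[url] = (content, time.monotonic())

    def get(self, url):
        entry = self._store.get(url)
        if entry is None:
            return None
        content, stored_at = entry
        if time.monotonic() - stored_at > self.ttl:
            del self._store[url]  # expired: evict and report a miss
            return None
        return content

cache = TTLCache(ttl=3600)
cache.set("https://docs.crawl4ai.com/", "<page markdown>")
print(cache.get("https://docs.crawl4ai.com/"))
```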

Debugging

You can use the MCP inspector to debug the server. For uvx installations:

DOCY_DOCUMENTATION_URLS="https://docs.crawl4ai.com/" npx @modelcontextprotocol/inspector uvx mcp-server-docy

Or if you've installed the package in a specific directory or are developing on it:

cd path/to/docy
DOCY_DOCUMENTATION_URLS="https://docs.crawl4ai.com/" npx @modelcontextprotocol/inspector uv run mcp-server-docy

Release Process

The project uses GitHub Actions for automated releases:

  1. Update the version in pyproject.toml
  2. Create a new tag with git tag vX.Y.Z (e.g., git tag v0.1.0)
  3. Push the tag with git push --tags

This will automatically:

  • Verify the version in pyproject.toml matches the tag
  • Run tests and lint checks
  • Build and publish to PyPI
  • Build and publish to Docker Hub as oborchers/mcp-server-docy:latest and oborchers/mcp-server-docy:X.Y.Z
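
The first verification step, matching the `pyproject.toml` version against the tag, might look like this in Python (the real GitHub Actions workflow may implement it differently):

```python
import re

def version_matches_tag(pyproject_text, tag):
    """Check that pyproject.toml's version equals the git tag minus its
    leading 'v' (illustrative sketch of the release check)."""
    m = re.search(r'^version\s*=\s*"([^"]+)"', pyproject_text, re.MULTILINE)
    return bool(m) and m.group(1) == tag.removeprefix("v")

print(version_matches_tag('version = "0.2.1"', "v0.2.1"))
```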

Contributing

We encourage contributions to help expand and improve mcp-server-docy. Whether you want to add new features, enhance existing functionality, or improve documentation, your input is valuable.

For examples of other MCP servers and implementation patterns, see: https://github.com/modelcontextprotocol/servers

Pull requests are welcome! Feel free to contribute new ideas, bug fixes, or enhancements to make mcp-server-docy even more powerful and useful.

License

mcp-server-docy is licensed under the MIT License. This means you are free to use, modify, and distribute the software, subject to the terms and conditions of the MIT License. For more details, please see the LICENSE file in the project repository.
