
🔍 web-search-plus-mcp


Multi-provider web search and extraction MCP server with intelligent auto-routing.

web-search-plus-mcp is the standalone MCP packaging of Web Search Plus. It gives Claude Desktop, NanoBot, Cursor, and other MCP-compatible hosts access to the same Python routing engine family used by the Hermes/OpenClaw Web Search Plus tools.

✨ Features

  • 10 search providers — Serper, Brave, Tavily, Exa, Querit, Linkup, Firecrawl, Perplexity, You.com, SearXNG
  • 5 extract providers — Firecrawl, Linkup, Tavily, Exa, You.com
  • Intelligent auto-routing — scores query intent and picks a provider automatically
  • Quality reports — optional routing/result diagnostics
  • Research mode — opt-in multi-provider search + top-source extraction with a time budget
  • Zero-install run — uvx web-search-plus-mcp
  • MCP-native — stdio server exposing web_search and web_extract

🚀 Quick Start

# Run instantly with uvx
uvx web-search-plus-mcp

# Or install globally
pip install web-search-plus-mcp
web-search-plus-mcp

At least one provider credential is required for search. Extraction needs at least one extraction-capable provider key.
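Before launching the server, it can help to check which provider credentials are actually set. The snippet below is a standalone sketch, not part of the package: the variable names match the config section of this README, but the helper itself is hypothetical.

```python
import os

# Hypothetical helper (not shipped with web-search-plus-mcp): report which
# provider credentials are present in the environment before starting the server.
PROVIDER_KEYS = {
    "serper": "SERPER_API_KEY",
    "brave": "BRAVE_API_KEY",
    "tavily": "TAVILY_API_KEY",
    "exa": "EXA_API_KEY",
    "querit": "QUERIT_API_KEY",
    "linkup": "LINKUP_API_KEY",
    "firecrawl": "FIRECRAWL_API_KEY",
    "perplexity": "PERPLEXITY_API_KEY",
    "you": "YOU_API_KEY",
    "searxng": "SEARXNG_INSTANCE_URL",  # SearXNG takes an instance URL, not an API key
}

def configured_providers(env=None):
    """Return provider names whose credential variable is set and non-empty."""
    env = os.environ if env is None else env
    return sorted(name for name, var in PROVIDER_KEYS.items() if env.get(var))

if __name__ == "__main__":
    found = configured_providers()
    if found:
        print("Configured providers:", ", ".join(found))
    else:
        print("No provider credentials found; web_search will fail.")
```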

⚙️ Claude Desktop Config

Add to ~/Library/Application Support/Claude/claude_desktop_config.json on macOS or %APPDATA%\Claude\claude_desktop_config.json on Windows:

{
  "mcpServers": {
    "web-search-plus": {
      "command": "uvx",
      "args": ["web-search-plus-mcp"],
      "env": {
        "LINKUP_API_KEY": "your_linkup_key",
        "TAVILY_API_KEY": "your_tavily_key",
        "EXA_API_KEY": "your_exa_key",
        "FIRECRAWL_API_KEY": "your_firecrawl_key",
        "BRAVE_API_KEY": "your_brave_key",
        "SERPER_API_KEY": "your_serper_key",
        "QUERIT_API_KEY": "your_querit_key",
        "PERPLEXITY_API_KEY": "your_perplexity_key",
        "YOU_API_KEY": "your_you_key",
        "SEARXNG_INSTANCE_URL": "https://your-searxng-instance.example.com"
      }
    }
  }
}

You can also place a .env file next to the package/project with the same variables.
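For the .env route, the file uses the same variable names as the host config above. A minimal example (only set the providers you actually use; values are placeholders):

```
# .env — same variables as the MCP host config
LINKUP_API_KEY=your_linkup_key
TAVILY_API_KEY=your_tavily_key
SEARXNG_INSTANCE_URL=https://your-searxng-instance.example.com
```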

🔎 Search Providers

  • Serper — Google-style facts, news, shopping, local queries
  • Brave — general-purpose independent web index
  • Tavily — research and analysis
  • Exa — semantic discovery, similarity, deep/deep-reasoning synthesis
  • Querit — multilingual, real-time AI search
  • Linkup — source-backed grounding/citations
  • Firecrawl — web search plus scrape-ready content
  • Perplexity — direct synthesized answers
  • You.com — LLM-ready real-time snippets
  • SearXNG — privacy-first self-hosted meta-search

📄 Extract Providers

  • Linkup — recommended first choice for clean markdown and low cost
  • Firecrawl — robust scrape fallback, useful for JS-heavy/blocked pages
  • Tavily — extraction/content API
  • Exa — contents API
  • You.com — LLM-ready snippets/content where available

🛠 MCP Tool Reference

web_search

Parameters:

  • query — required search query
  • provider — auto, serper, brave, tavily, exa, querit, linkup, firecrawl, perplexity, you, searxng
  • count — results to return, default 5, max 20
  • depth — Exa depth: normal, deep, deep-reasoning
  • time_range — hour, day, week, month, year
  • include_domains / exclude_domains — domain allow/deny lists
  • mode — normal or research
  • quality_report — include routing/result diagnostics
  • research_time_budget — best-effort wall-clock budget for research mode

Example MCP arguments:

{
  "query": "latest Hermes Agent release",
  "provider": "linkup",
  "count": 5,
  "quality_report": true
}
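Research mode can be combined with a time budget and a quality report. A hypothetical invocation (the query and budget value here are illustrative; check the package docs for the budget's exact units and semantics):

```
{
  "query": "state of solid-state battery manufacturing",
  "mode": "research",
  "count": 10,
  "research_time_budget": 60,
  "quality_report": true
}
```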

web_extract

Parameters:

  • urls — required list of URLs
  • provider — auto, firecrawl, linkup, tavily, exa, you
  • format — markdown or html
  • include_images — include image metadata when supported
  • include_raw_html — include raw HTML when supported
  • render_js — render JavaScript before extraction when supported

Example MCP arguments:

{
  "urls": ["https://example.com"],
  "provider": "linkup",
  "format": "markdown"
}
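For JavaScript-heavy pages, the extract provider list above suggests Firecrawl with render_js enabled. A hypothetical invocation (the URL is a placeholder):

```
{
  "urls": ["https://example.com/app"],
  "provider": "firecrawl",
  "format": "markdown",
  "render_js": true
}
```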

🧠 Auto-Routing Examples

  • iPhone 16 Pro price → Serper/Brave shopping-style search
  • how does TCP/IP work → Tavily research-style search
  • latest multilingual EV market updates → Querit/Linkup real-time/source-backed search
  • companies like Stripe → Exa discovery search
  • what is quantum computing → Perplexity/You.com direct-answer style search
  • privacy focused search results → SearXNG when configured
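The examples above can be approximated with a toy keyword-overlap scorer. This is illustrative only: the package's real routing engine and its signals are not documented here, so every rule and weight below is an assumption for demonstration.

```python
# Toy intent router in the spirit of the examples above. Each rule pairs a
# keyword set with a provider; the rule with the largest overlap wins, and a
# general-purpose provider is the fallback. All rules are illustrative.
RULES = [
    ({"price", "buy", "cost", "shopping"}, "serper"),
    ({"how", "why", "explain"}, "tavily"),
    ({"latest", "news", "update", "updates"}, "linkup"),
    ({"like", "similar", "alternatives"}, "exa"),
    ({"what", "is", "define"}, "perplexity"),
]

def route(query: str, default: str = "brave") -> str:
    """Pick the rule whose keywords overlap the query most; fall back to default."""
    words = set(query.lower().split())
    best, best_score = default, 0
    for keywords, provider in RULES:
        score = len(words & keywords)
        if score > best_score:
            best, best_score = provider, score
    return best

print(route("iPhone 16 Pro price"))      # shopping-style keyword → serper
print(route("companies like Stripe"))    # similarity keyword → exa
```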

Credits

Built on the Web Search Plus routing logic originally developed for OpenClaw/Clawhub and later ported to Hermes as hermes-web-search-plus.

License

MIT © 2026 robbyczgw-cla
