Multi LLM Cross-Check MCP Server

A Model Context Protocol (MCP) server that cross-checks responses from multiple LLM providers simultaneously. It integrates with Claude Desktop as an MCP server, providing a unified interface for querying different LLM APIs.

Features

  • Query multiple LLM providers in parallel
  • Currently supports:
    • OpenAI (ChatGPT)
    • Anthropic (Claude)
    • Perplexity AI
    • Google (Gemini)
  • Asynchronous parallel processing for faster responses
  • Easy integration with Claude Desktop
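The asynchronous parallel processing mentioned above can be sketched with `asyncio.gather`. This is a minimal illustration, not the server's actual code: the per-provider function names (`query_openai`, `query_anthropic`) and the `cross_check` signature are hypothetical stand-ins.

```python
import asyncio

# Hypothetical per-provider query functions; the real server calls the
# providers' APIs, with names and signatures that may differ.
async def query_openai(prompt: str) -> str:
    await asyncio.sleep(0.1)  # stand-in for the real API call
    return f"ChatGPT answer to: {prompt}"

async def query_anthropic(prompt: str) -> str:
    await asyncio.sleep(0.1)
    return f"Claude answer to: {prompt}"

async def cross_check(prompt: str) -> dict:
    # Run all provider calls concurrently: total latency is roughly
    # that of the slowest provider, not the sum of all of them.
    results = await asyncio.gather(
        query_openai(prompt),
        query_anthropic(prompt),
    )
    return {"ChatGPT": results[0], "Claude": results[1]}

print(asyncio.run(cross_check("What is MCP?")))
```

Because the calls run concurrently, adding more providers barely increases end-to-end latency compared with querying them one after another.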

Prerequisites

  • Python 3.8 or higher
  • API keys for the LLM providers you want to use
  • uv package manager (install with pip install uv)

Installation

Installing via Smithery

To install Multi LLM Cross-Check Server for Claude Desktop automatically via Smithery:

npx -y @smithery/cli install @lior-ps/multi-llm-cross-check-mcp-server --client claude

Manual Installation

  1. Clone this repository:
git clone https://github.com/lior-ps/multi-llm-cross-check-mcp-server.git
cd multi-llm-cross-check-mcp-server
  2. Initialize the uv environment and install the requirements:
uv venv
uv pip install -r requirements.txt
  3. Configure in Claude Desktop: Create a file named claude_desktop_config.json in your Claude Desktop configuration directory with the following content:

    {
      "mcp_servers": [
        {
          "command": "uv",
          "args": [
            "--directory",
            "/multi-llm-cross-check-mcp-server",
            "run",
            "main.py"
          ],
          "env": {
            "OPENAI_API_KEY": "your_openai_key",
            "ANTHROPIC_API_KEY": "your_anthropic_key",
            "PERPLEXITY_API_KEY": "your_perplexity_key",
            "GEMINI_API_KEY": "your_gemini_key"
          }
        }
      ]
    }

    API keys are available from each provider's console: OpenAI (https://platform.openai.com/api-keys), Anthropic (https://console.anthropic.com/account/keys), Perplexity (https://www.perplexity.ai/settings/api), and Gemini (https://makersuite.google.com/app/apikey). Note that JSON does not support comments, so the key values must stand alone.
    

    Notes:

    1. You only need to add API keys for the LLM providers you want to use; the server skips any provider without a configured key.
    2. You may need to put the full path to the uv executable in the command field. Find it by running which uv on macOS/Linux or where uv on Windows.

Using the MCP Server

Once configured:

  1. The server will automatically start when you open Claude Desktop
  2. You can use the cross_check tool in your conversations by asking to "cross check with other LLMs"
  3. Provide a prompt, and it will return responses from all configured LLM providers

API Response Format

The server returns a dictionary with responses from each LLM provider:

{
    "ChatGPT": { ... },
    "Claude": { ... },
    "Perplexity": { ... },
    "Gemini": { ... }
}

Error Handling

  • If an API key is not provided for a specific LLM, that provider will be skipped
  • API errors are caught and returned in the response
  • Each LLM's response is independent, so errors with one provider won't affect others
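The behavior described above — skipping unconfigured providers and isolating per-provider failures — can be sketched as follows. This is an illustrative sketch, not the server's actual implementation: the `PROVIDERS` registry and `call_provider` function are hypothetical.

```python
import asyncio
import os

# Hypothetical registry mapping each provider's display name to the
# environment variable holding its API key.
PROVIDERS = {
    "ChatGPT": "OPENAI_API_KEY",
    "Claude": "ANTHROPIC_API_KEY",
    "Perplexity": "PERPLEXITY_API_KEY",
    "Gemini": "GEMINI_API_KEY",
}

async def call_provider(name: str, prompt: str) -> str:
    # Stand-in for the real API call to the named provider.
    return f"{name} response"

async def cross_check(prompt: str) -> dict:
    responses: dict = {}

    async def one(name: str) -> None:
        # Catch errors per provider so a failure in one API call
        # cannot affect the responses from the others.
        try:
            responses[name] = await call_provider(name, prompt)
        except Exception as exc:
            responses[name] = f"Error: {exc}"

    # Skip any provider whose API key environment variable is unset.
    tasks = [one(n) for n, env in PROVIDERS.items() if os.environ.get(env)]
    await asyncio.gather(*tasks)
    return responses
```

With this structure, a provider that raises an exception still contributes an entry to the result dictionary (an error message), while unconfigured providers are simply absent from it.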

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

License

This project is licensed under the MIT License - see the LICENSE file for details.
