
A simple MCP server that provides a unified interface to various LLM providers using Pydantic AI

Project description

LLM Bridge MCP


LLM Bridge MCP allows AI agents to interact with multiple large language models through a standardized interface. It leverages the Model Context Protocol (MCP) to provide seamless access to different LLM providers, making it easy to switch between models or to use several models in the same application.

Features

  • Unified interface to multiple LLM providers:
    • OpenAI (GPT models)
    • Anthropic (Claude models)
    • Google (Gemini models)
    • DeepSeek
    • and more
  • Built with Pydantic AI for type safety and validation
  • Supports customizable parameters like temperature and max tokens
  • Provides usage tracking and metrics

Tools

The server implements the following tool:

run_llm(
    prompt: str,
    model_name: KnownModelName = "openai:gpt-4o-mini",
    temperature: float = 0.7,
    max_tokens: int = 8192,
    system_prompt: str = "",
) -> LLMResponse
  • prompt: The text prompt to send to the LLM
  • model_name: Specific model to use (default: "openai:gpt-4o-mini")
  • temperature: Controls randomness (0.0 to 1.0)
  • max_tokens: Maximum number of tokens to generate
  • system_prompt: Optional system prompt to guide the model's behavior
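The parameter defaults above can be mirrored in a small request container. The sketch below is illustrative only: the real server validates these fields with Pydantic AI, and the class name and validation logic here are assumptions, not the package's actual code. It does show one documented constraint, the 0.0-1.0 temperature range:

```python
from dataclasses import dataclass

@dataclass
class RunLLMRequest:
    """Hypothetical container mirroring run_llm's parameters (illustration only)."""
    prompt: str
    model_name: str = "openai:gpt-4o-mini"
    temperature: float = 0.7
    max_tokens: int = 8192
    system_prompt: str = ""

    def __post_init__(self) -> None:
        # temperature is documented as 0.0 to 1.0; reject values outside that range
        if not 0.0 <= self.temperature <= 1.0:
            raise ValueError(f"temperature must be in [0.0, 1.0], got {self.temperature}")

# Only the prompt is required; everything else falls back to the documented defaults.
req = RunLLMRequest(prompt="Summarize MCP in one sentence.")
```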

Installation

Installing via Smithery

To install llm-bridge-mcp for Claude Desktop automatically via Smithery:

npx -y @smithery/cli install @sjquant/llm-bridge-mcp --client claude

Manual Installation

  1. Clone the repository:
git clone https://github.com/yourusername/llm-bridge-mcp.git
cd llm-bridge-mcp
  2. Install uv (if not already installed):
# On macOS
brew install uv

# On Linux
curl -LsSf https://astral.sh/uv/install.sh | sh

# On Windows
powershell -ExecutionPolicy ByPass -c "irm https://astral.sh/uv/install.ps1 | iex"

Configuration

Create a .env file in the root directory with your API keys:

OPENAI_API_KEY=your_openai_api_key
ANTHROPIC_API_KEY=your_anthropic_api_key
GOOGLE_API_KEY=your_google_api_key
DEEPSEEK_API_KEY=your_deepseek_api_key
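The server reads these keys from the environment. A minimal, stdlib-only sketch of how such a .env file can be parsed is shown below; real projects typically use python-dotenv, which also handles quoting and variable interpolation, so treat this as an illustration of the file format rather than the package's actual loading code:

```python
import os

def load_dotenv_minimal(path: str = ".env") -> dict:
    """Parse KEY=value lines from a .env file into os.environ.

    Blank lines, comments, and lines without '=' are skipped; existing
    environment variables are not overwritten.
    """
    loaded = {}
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            loaded[key.strip()] = value.strip()
            os.environ.setdefault(key.strip(), value.strip())
    return loaded
```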

Usage

Using with Claude Desktop or Cursor

Add a server entry to your Claude Desktop configuration file or .cursor/mcp.json:

"mcpServers": {
  "llm-bridge": {
    "command": "uvx",
    "args": [
      "llm-bridge-mcp"
    ],
    "env": {
      "OPENAI_API_KEY": "your_openai_api_key",
      "ANTHROPIC_API_KEY": "your_anthropic_api_key",
      "GOOGLE_API_KEY": "your_google_api_key",
      "DEEPSEEK_API_KEY": "your_deepseek_api_key"
    }
  }
}
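The entry above must sit inside a valid JSON document, and a stray comma or quote is the most common mistake when editing it by hand. One quick sanity check is to round-trip the file through Python's json module; the key names below match the snippet above, and the API key value is a placeholder:

```python
import json

# A pared-down copy of the config fragment above (placeholder API key).
config_text = """
{
  "mcpServers": {
    "llm-bridge": {
      "command": "uvx",
      "args": ["llm-bridge-mcp"],
      "env": {
        "OPENAI_API_KEY": "your_openai_api_key"
      }
    }
  }
}
"""

# json.loads raises json.JSONDecodeError if the file is malformed.
config = json.loads(config_text)
server = config["mcpServers"]["llm-bridge"]
print(server["command"])
```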

Troubleshooting

Common Issues

1. "spawn uvx ENOENT" Error

This error occurs when the system cannot find the uvx executable in your PATH. To resolve this:

Solution: Use the full path to uvx

Find the full path to your uvx executable:

# On macOS/Linux
which uvx

# On Windows
where.exe uvx

Then update your MCP server configuration to use the full path:

"mcpServers": {
  "llm-bridge": {
    "command": "/full/path/to/uvx",  // Replace with your actual path
    "args": [
      "llm-bridge-mcp"
    ],
    "env": {
      // ... your environment variables
    }
  }
}
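The same lookup can also be done from Python's standard library, which is handy if you generate the config file programmatically. `shutil.which` returns the full path when the executable is on PATH and `None` otherwise:

```python
import shutil

# Returns e.g. "/usr/local/bin/uvx" on macOS/Linux, or None if not on PATH.
uvx_path = shutil.which("uvx")

if uvx_path is None:
    print("uvx not found on PATH; install uv or point 'command' at the full path")
else:
    print(f'"command": "{uvx_path}"')
```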

License

This project is licensed under the MIT License - see the LICENSE file for details.

Project details


Download files

Download the file for your platform.

Source Distribution

llm_bridge_mcp-0.1.2.tar.gz (46.6 kB)

Built Distribution

llm_bridge_mcp-0.1.2-py3-none-any.whl (4.7 kB)

File details

Details for the file llm_bridge_mcp-0.1.2.tar.gz.

File metadata

  • File name: llm_bridge_mcp-0.1.2.tar.gz
  • Size: 46.6 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: uv/0.6.10

File hashes

Hashes for llm_bridge_mcp-0.1.2.tar.gz:

Algorithm    Hash digest
SHA256       d96d54e93ecbb36f32c68b5cfaf6bd58767e9051cf0f0fbd2800fe802c2ebfd8
MD5          3520c26a078c90ff81bb0fa6f0502dff
BLAKE2b-256  6a4c5d1a9c36755d833b415d713045ff1e04b6f57c358a14a0ffbb3d3a215624
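To verify a downloaded file against the digests above, you can compute its SHA256 locally with the standard library (the filename argument is whichever distribution you downloaded):

```python
import hashlib

def sha256_of(path: str, chunk_size: int = 65536) -> str:
    """Return the SHA256 hex digest of a file, reading it in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Compare the result against the published digest, e.g.:
# sha256_of("llm_bridge_mcp-0.1.2.tar.gz") == "d96d54e93ecbb3..."
```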


File details

Details for the file llm_bridge_mcp-0.1.2-py3-none-any.whl.

File hashes

Hashes for llm_bridge_mcp-0.1.2-py3-none-any.whl:

Algorithm    Hash digest
SHA256       e4c5c313f91581f465bdcf13cc276aa6abc58105e2c8eb72cc71e115016cb51d
MD5          d08f2d9ef8aab125260e8c743b8bf37a
BLAKE2b-256  bf3263ca28159c46cd4fc50a69a8c786b2c64b3fafea0bcf84f750318532b06c

