A simple MCP server that provides a unified interface to various LLM providers using Pydantic AI

LLM Bridge MCP

LLM Bridge MCP allows AI agents to interact with multiple large language models through a standardized interface. It leverages the Model Context Protocol (MCP) to provide seamless access to different LLM providers, making it easy to switch between models or to use multiple models in the same application.

Features

  • Unified interface to multiple LLM providers:
    • OpenAI (GPT models)
    • Anthropic (Claude models)
    • Google (Gemini models)
    • DeepSeek
    • ...
  • Built with Pydantic AI for type safety and validation
  • Supports customizable parameters like temperature and max tokens
  • Provides usage tracking and metrics
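The usage-tracking feature could look roughly like the sketch below. `UsageTracker` and its fields are hypothetical illustrations of per-model counting, not the project's actual API:

```python
from dataclasses import dataclass, field


@dataclass
class UsageTracker:
    """Hypothetical per-model usage counter (not the project's actual API)."""
    counts: dict = field(default_factory=dict)   # model_name -> request count
    tokens: dict = field(default_factory=dict)   # model_name -> total tokens used

    def record(self, model_name: str, total_tokens: int) -> None:
        # Accumulate one request's usage under the given model name.
        self.counts[model_name] = self.counts.get(model_name, 0) + 1
        self.tokens[model_name] = self.tokens.get(model_name, 0) + total_tokens


tracker = UsageTracker()
tracker.record("openai:gpt-4o-mini", 120)
tracker.record("openai:gpt-4o-mini", 80)
print(tracker.counts["openai:gpt-4o-mini"])  # 2
print(tracker.tokens["openai:gpt-4o-mini"])  # 200
```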

Tools

The server implements the following tool:

run_llm(
    prompt: str,
    model_name: KnownModelName = "openai:gpt-4o-mini",
    temperature: float = 0.7,
    max_tokens: int = 8192,
    system_prompt: str = "",
) -> LLMResponse
  • prompt: The text prompt to send to the LLM
  • model_name: Specific model to use (default: "openai:gpt-4o-mini")
  • temperature: Controls randomness (0.0 to 1.0)
  • max_tokens: Maximum number of tokens to generate
  • system_prompt: Optional system prompt to guide the model's behavior
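A minimal sketch of how these parameters might be validated before a request is dispatched to a provider. The helper below is a hypothetical illustration of the documented ranges, not the server's actual code:

```python
def check_llm_params(temperature: float = 0.7, max_tokens: int = 8192) -> None:
    """Reject out-of-range values before sending a request (hypothetical helper)."""
    if not 0.0 <= temperature <= 1.0:
        raise ValueError(f"temperature must be in [0.0, 1.0], got {temperature}")
    if max_tokens < 1:
        raise ValueError(f"max_tokens must be a positive integer, got {max_tokens}")


check_llm_params()  # the documented defaults (0.7, 8192) pass
try:
    check_llm_params(temperature=1.5)
except ValueError as e:
    print(e)  # temperature must be in [0.0, 1.0], got 1.5
```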

Installation

  1. Clone the repository:
git clone https://github.com/yourusername/llm-bridge-mcp.git
cd llm-bridge-mcp
  2. Install uv (if not already installed):
# On macOS and Linux
curl -LsSf https://astral.sh/uv/install.sh | sh

# On Windows
powershell -ExecutionPolicy ByPass -c "irm https://astral.sh/uv/install.ps1 | iex"

Configuration

Create a .env file in the root directory with your API keys:

OPENAI_API_KEY=your_openai_api_key
ANTHROPIC_API_KEY=your_anthropic_api_key
GOOGLE_API_KEY=your_google_api_key
DEEPSEEK_API_KEY=your_deepseek_api_key
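For reference, a `.env` file of this shape can be loaded with a few lines of standard-library Python. This is a stdlib-only sketch; the project itself may rely on a dotenv library instead:

```python
import os
import tempfile


def load_dotenv_minimal(path: str) -> None:
    """Minimal .env loader: KEY=value lines, '#' comments, no overrides."""
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            # setdefault keeps any value already present in the environment.
            os.environ.setdefault(key.strip(), value.strip())


# Demo with a throwaway file so the sketch is runnable as-is:
with tempfile.NamedTemporaryFile("w", suffix=".env", delete=False) as f:
    f.write("LLM_BRIDGE_DEMO_KEY=demo_value\n")
    demo_path = f.name

load_dotenv_minimal(demo_path)
print(os.environ["LLM_BRIDGE_DEMO_KEY"])  # demo_value
```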

Usage

Using with Claude Desktop or Cursor

Add a server entry to your Claude Desktop configuration file or .cursor/mcp.json:

"mcpServers": {
  "llm-bridge": {
    "command": "uvx",
    "args": [
      "llm-bridge"
    ],
    "env": {
      "OPENAI_API_KEY": "your_openai_api_key",
      "ANTHROPIC_API_KEY": "your_anthropic_api_key",
      "GOOGLE_API_KEY": "your_google_api_key",
      "DEEPSEEK_API_KEY": "your_deepseek_api_key"
    }
  }
}
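The shape of that entry can be sanity-checked with a short script before restarting the client. This is a convenience sketch using only the standard library, not part of the project:

```python
import json

# The same structure as the configuration snippet above, as a JSON document.
config_text = """
{
  "mcpServers": {
    "llm-bridge": {
      "command": "uvx",
      "args": ["llm-bridge"],
      "env": {"OPENAI_API_KEY": "your_openai_api_key"}
    }
  }
}
"""

config = json.loads(config_text)          # raises ValueError on malformed JSON
server = config["mcpServers"]["llm-bridge"]
assert server["command"] == "uvx"
print(server["args"])  # ['llm-bridge']
```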

License

This project is licensed under the MIT License - see the LICENSE file for details.

Download files

Download the file for your platform.

Source Distribution

llm_bridge_mcp-0.1.0.tar.gz (45.1 kB)

Uploaded Source

Built Distribution


llm_bridge_mcp-0.1.0-py3-none-any.whl (4.3 kB)

Uploaded Python 3

File details

Details for the file llm_bridge_mcp-0.1.0.tar.gz.

File metadata

  • Download URL: llm_bridge_mcp-0.1.0.tar.gz
  • Upload date:
  • Size: 45.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: uv/0.6.6

File hashes

Hashes for llm_bridge_mcp-0.1.0.tar.gz:

  • SHA256: f9f27f16bc2445c11453a80fdc3aa7bfcbf2aeed489354686150746e2ead6739
  • MD5: 55f006224dfd2df6b7824b7118a2d42c
  • BLAKE2b-256: 7eea25fc67d7b2bfe61cfd3f115b3beff30c31d1e94dcb770b06ca86312a2c21


File details

Details for the file llm_bridge_mcp-0.1.0-py3-none-any.whl.

File hashes

Hashes for llm_bridge_mcp-0.1.0-py3-none-any.whl:

  • SHA256: 8dd202c83e8573071e3c18c8731d60e0cb688005839d92f2138bd4934db3d9f3
  • MD5: 010456f6cea15cf0fcbe068bd8090c0b
  • BLAKE2b-256: 2358f07b9de10f8bd73173fd958070db3d2c5e4f55e7ef2674de6e5c05e52aab

