
MCP server giving AI agents access to 100+ LLMs through LiteLLM


LiteLLM Agent MCP Server

Give your AI agent access to 100+ LLMs.

This MCP server lets AI agents (Claude Code, Cursor, etc.) call any LLM through LiteLLM's unified API. Stop being limited to one model — use the right model for each task.

Why?

AI agents are typically stuck on a single model. With this MCP server, your agent can:

  • 🔀 Call any model — GPT-4, Claude, Gemini, Mistral, and 100+ more
  • ⚖️ Compare outputs — Get responses from multiple models and pick the best
  • 🎯 Use the right tool — Code tasks → GPT-4, writing → Claude, long docs → Gemini
  • 💰 Save costs — Route simple queries to cheaper models
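The cost-routing idea above can be sketched as a small heuristic. This is an illustrative sketch only: the model names and thresholds are assumptions, not part of this package's actual logic.

```python
# Illustrative cost router: short, simple prompts go to a cheap model,
# code-heavy prompts to a stronger one. Model names and thresholds here
# are assumptions for the sketch, not this package's API.

def pick_model(prompt: str) -> str:
    """Pick a model name from a rough cost/complexity heuristic."""
    looks_like_code = "def " in prompt or "class " in prompt
    if looks_like_code:
        return "gpt-4o"                    # stronger model for code tasks
    if len(prompt) < 200:
        return "gpt-4o-mini"               # cheap model for short queries
    return "claude-sonnet-4-20250514"      # long-form writing/reasoning

print(pick_model("What is 2 + 2?"))  # -> gpt-4o-mini
```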

Tools

| Tool        | Description                                  |
| ----------- | -------------------------------------------- |
| `call`      | Call any LLM model with a prompt             |
| `compare`   | Compare responses from multiple models       |
| `models`    | List available models and their strengths    |
| `recommend` | Get a model recommendation for a task type   |

Installation

Claude Desktop / Cursor

Add to your MCP config:

{
  "mcpServers": {
    "litellm": {
      "command": "python",
      "args": ["-m", "litellm_agent_mcp"],
      "env": {
        "OPENAI_API_KEY": "sk-...",
        "ANTHROPIC_API_KEY": "sk-..."
      }
    }
  }
}

From PyPI

pip install litellm-agent-mcp

From Source

git clone https://github.com/shin-bot-litellm/litellm-agent-mcp
cd litellm-agent-mcp
pip install -e .

Usage Examples

Call a specific model

Use the `call` tool:
- model: "gpt-4o"  
- prompt: "Explain this code: [code here]"

Compare multiple models

Use the `compare` tool:
- models: ["gpt-4o", "claude-sonnet-4-20250514"]
- prompt: "What's the best approach to implement caching?"
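Conceptually, `compare` fans the same prompt out to several models and collects the answers by model name. A minimal sketch of that fan-out logic (the caller function is injected here so the logic stands alone; in the real server each call would go through LiteLLM):

```python
from typing import Callable

def compare(models: list[str], prompt: str,
            call_model: Callable[[str, str], str]) -> dict[str, str]:
    """Send the same prompt to each model and collect replies by model name.

    `call_model(model, prompt)` is whatever function actually performs the
    LLM call; it is injected here so the fan-out logic is self-contained.
    """
    return {m: call_model(m, prompt) for m in models}

# Stand-in caller for demonstration (a real one would call the LLM API).
fake = lambda model, prompt: f"{model} says hi"
print(compare(["gpt-4o", "claude-sonnet-4-20250514"], "Hello?", fake))
```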

Get a recommendation

Use the `recommend` tool:
- task_type: "code"

→ Returns: gpt-4o (Strong at code generation, debugging, and review)
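A plausible shape for the recommendation logic is a static task-to-model table. Only the `code` row below is confirmed by the example above; the other entries are illustrative assumptions.

```python
# Hypothetical task -> (model, reason) table; only the "code" row comes
# from the documented example, the rest are illustrative assumptions.
RECOMMENDATIONS = {
    "code": ("gpt-4o", "Strong at code generation, debugging, and review"),
    "writing": ("claude-sonnet-4-20250514", "Strong long-form prose"),
    "long_docs": ("gemini-1.5-pro", "Large context window"),
}

def recommend(task_type: str) -> tuple[str, str]:
    """Look up a model recommendation, defaulting to a general model."""
    return RECOMMENDATIONS.get(task_type, ("gpt-4o-mini", "General default"))

print(recommend("code")[0])  # -> gpt-4o
```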

Environment Variables

Set API keys for the providers you want to use:

OPENAI_API_KEY=sk-...
ANTHROPIC_API_KEY=sk-...
GEMINI_API_KEY=...
MISTRAL_API_KEY=...

Or point to a LiteLLM proxy:

LITELLM_API_BASE=https://your-proxy.com
LITELLM_API_KEY=sk-...
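When pointed at a LiteLLM proxy, requests go to a single OpenAI-compatible endpoint. A sketch of what such a request looks like, using only the standard library (the proxy URL is a placeholder from the snippet above; the request is built but not sent):

```python
import json
import os
import urllib.request

def build_chat_request(model: str, prompt: str) -> urllib.request.Request:
    """Build (but do not send) an OpenAI-compatible chat request for a
    LiteLLM proxy. LITELLM_API_BASE / LITELLM_API_KEY come from the env,
    with placeholder defaults for illustration."""
    base = os.environ.get("LITELLM_API_BASE", "https://your-proxy.com")
    key = os.environ.get("LITELLM_API_KEY", "sk-placeholder")
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return urllib.request.Request(
        base.rstrip("/") + "/chat/completions",
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {key}",
        },
    )

req = build_chat_request("gpt-4o", "Hello!")
print(req.full_url)  # e.g. https://your-proxy.com/chat/completions
```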

Supported Models

| Provider   | Models                                      |
| ---------- | ------------------------------------------- |
| OpenAI     | gpt-4o, gpt-4o-mini, o1-preview, o1-mini    |
| Anthropic  | claude-sonnet-4, claude-opus-4              |
| Google     | gemini-1.5-pro, gemini-1.5-flash            |
| Mistral    | mistral-large-latest                        |
| 100+ more  | See the LiteLLM docs                        |
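LiteLLM routes non-OpenAI providers by a prefix on the model string (for example `gemini/gemini-1.5-pro`), while OpenAI models need no prefix. A sketch of normalizing the bare names in the table above to LiteLLM routes; the prefix table is an illustrative subset, not this package's code:

```python
# LiteLLM identifies providers by a prefix on the model string
# (e.g. "gemini/gemini-1.5-pro"); OpenAI models need no prefix.
# This prefix table is an illustrative subset, not this package's code.
PROVIDER_PREFIXES = {
    "gemini-1.5-pro": "gemini/gemini-1.5-pro",
    "gemini-1.5-flash": "gemini/gemini-1.5-flash",
    "mistral-large-latest": "mistral/mistral-large-latest",
}

def to_litellm_name(model: str) -> str:
    """Map a bare model name to its LiteLLM route, if one is needed."""
    return PROVIDER_PREFIXES.get(model, model)

print(to_litellm_name("gemini-1.5-pro"))  # -> gemini/gemini-1.5-pro
```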

License

MIT

Download files

Source Distribution

litellm_mcp-1.0.0.tar.gz (5.0 kB)

Built Distribution

litellm_mcp-1.0.0-py3-none-any.whl (5.5 kB)

File details: litellm_mcp-1.0.0.tar.gz

  • Size: 5.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.11.2

| Algorithm   | Hash digest                                                      |
| ----------- | ---------------------------------------------------------------- |
| SHA256      | cce72d8c4a6456a943b21dc78fcbc5c1d1bc4321c871c72b3674143ff31c3963 |
| MD5         | ee360699882a344c15965507784ec822                                 |
| BLAKE2b-256 | c233a8fc5b1bcc82948a609e6f77c457b9c859d3f5432982150fc223628d19b9 |

File details: litellm_mcp-1.0.0-py3-none-any.whl

  • Size: 5.5 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.11.2

| Algorithm   | Hash digest                                                      |
| ----------- | ---------------------------------------------------------------- |
| SHA256      | b13a06351a1e5bbb19a123048df508227e4e50a6c0c4e39b3216518e832de355 |
| MD5         | 045ee002737b48a6340efef4f36ea6ed                                 |
| BLAKE2b-256 | d59e62498fc44883d85fc982247af7d18e138c4ad0bdf2a7c81b41b1812cea6d |
