
llmcp

A minimal CLI for interacting with LLMs via LiteLLM and MCP.

Installation

# Install with pip
pip install llmcp

# Or with uv (recommended)
uv pip install llmcp

Quickstart

# Search for available models
llmcp search 'gemini-2*'

# Run an MCP server for a model (blocks until interrupted)
llmcp serve gemini-2.5-pro-exp-03-25

# Start an MCP server, send it a single prompt, and exit
llmcp test gpt-4o-mini "What is the capital of France?"
# Example output:
# Response:
# The capital of France is Paris.

# Add a gemini:ask tool to Claude Code
claude mcp add gemini uvx llmcp serve gemini-2.5-pro-exp-03-25
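
The `claude mcp add` command above registers the server with Claude Code. For clients configured via a JSON file, the equivalent entry could look like this (a sketch following Claude Code's project-level `.mcp.json` format; the `gemini` server name is just a label):

```json
{
  "mcpServers": {
    "gemini": {
      "command": "uvx",
      "args": ["llmcp", "serve", "gemini-2.5-pro-exp-03-25"]
    }
  }
}
```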

MCP details:

  • The MCP server exposes a single, minimalist ask tool. The ask tool schema is as follows:
    {
      "name": "ask",
      "description": "Send a prompt to the {model_name} model and get a response.",
      "input_schema": {
        "type": "object",
        "properties": {
          "prompt": {
            "type": "string",
            "description": "The prompt to send to the {model_name} model."
          }
        },
        "required": ["prompt"]
      }
    }
    
  • Maximum token usage by default: requests automatically use the maximum output tokens available for the model.
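
The `{model_name}` placeholders in the schema above are filled in with the served model's name. A minimal sketch of how such a schema could be rendered (the `ask_tool_schema` helper is hypothetical, not part of llmcp's public API):

```python
def ask_tool_schema(model_name: str) -> dict:
    """Build the ask tool schema, substituting the model name into the placeholders."""
    return {
        "name": "ask",
        "description": f"Send a prompt to the {model_name} model and get a response.",
        "input_schema": {
            "type": "object",
            "properties": {
                "prompt": {
                    "type": "string",
                    "description": f"The prompt to send to the {model_name} model.",
                }
            },
            "required": ["prompt"],
        },
    }

# Example: render the schema for a specific model.
schema = ask_tool_schema("gpt-4o-mini")
print(schema["description"])  # Send a prompt to the gpt-4o-mini model and get a response.
```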

Configuration

Set up your API keys as environment variables:

Provider    Environment Variable
OpenAI      OPENAI_API_KEY
Anthropic   ANTHROPIC_API_KEY
Gemini      GEMINI_API_KEY
Mistral     MISTRAL_API_KEY
Cohere      COHERE_API_KEY
Groq        GROQ_API_KEY

Example:

export OPENAI_API_KEY="sk-..."
export ANTHROPIC_API_KEY="sk-..."
export GEMINI_API_KEY="..."
export MISTRAL_API_KEY="..."
export COHERE_API_KEY="..."
export GROQ_API_KEY="..."
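
Before launching llmcp, it can be handy to check which provider keys are actually visible to the process. A small illustrative helper (not part of llmcp) that mirrors the table above:

```python
import os

# Provider -> environment variable, mirroring the configuration table above.
PROVIDER_KEYS = {
    "OpenAI": "OPENAI_API_KEY",
    "Anthropic": "ANTHROPIC_API_KEY",
    "Gemini": "GEMINI_API_KEY",
    "Mistral": "MISTRAL_API_KEY",
    "Cohere": "COHERE_API_KEY",
    "Groq": "GROQ_API_KEY",
}

def configured_providers(env=os.environ) -> list:
    """Return the providers whose API key is set to a non-empty value."""
    return [provider for provider, var in PROVIDER_KEYS.items() if env.get(var)]

print(configured_providers())
```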

Acknowledgements

This tool relies on the following libraries:

  • LiteLLM for interacting with various LLM APIs.
  • MCP Python SDK for implementing the Model Context Protocol.

License

MIT
