
llmcp

A minimal CLI for interacting with LLMs via LiteLLM and MCP.

Installation

# Install with pip
pip install llmcp

# Or with uv (recommended)
uv pip install llmcp

Quickstart

# Install (recommended)
uv pip install llmcp

# Search for available models
llmcp search 'gemini-2*'

# Start an MCP server for a model
llmcp serve gemini-2.5-pro-exp-03-25

# Test calling a model via the MCP server (make sure your API key is set)
llmcp test gpt-4o-mini "What is the capital of France?"

Example output:

Response:
The capital of France is Paris.

MCP details:

  • The MCP server exposes a single minimalist ask tool with the following schema:
    {
      "name": "ask",
      "description": "Send a prompt to the {model_name} model and get a response.",
      "input_schema": {
        "type": "object",
        "properties": {
          "prompt": {
            "type": "string",
            "description": "The prompt to send to the {model_name} model."
          }
        },
        "required": ["prompt"]
      }
    }
    
  • Maximum token usage by default: the server automatically requests the maximum output tokens available for the model.
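The {model_name} placeholders above are filled in with the model the server was started with. A minimal sketch of building the concrete schema for a given model (the ask_tool_schema helper is illustrative, not part of llmcp's API):

```python
def ask_tool_schema(model_name: str) -> dict:
    """Build the ask tool schema shown above for a concrete model name."""
    return {
        "name": "ask",
        "description": f"Send a prompt to the {model_name} model and get a response.",
        "input_schema": {
            "type": "object",
            "properties": {
                "prompt": {
                    "type": "string",
                    "description": f"The prompt to send to the {model_name} model.",
                },
            },
            "required": ["prompt"],
        },
    }

# Example: schema for the model served in the quickstart
schema = ask_tool_schema("gpt-4o-mini")
```

Because "prompt" is the only required property, any MCP client that can call a tool with a single string argument can talk to the server.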

Configuration

Set up your API keys as environment variables:

Provider    Environment Variable
OpenAI      OPENAI_API_KEY
Anthropic   ANTHROPIC_API_KEY
Gemini      GEMINI_API_KEY
Mistral     MISTRAL_API_KEY
Cohere      COHERE_API_KEY
Groq        GROQ_API_KEY

Example:

export OPENAI_API_KEY="sk-..."
export ANTHROPIC_API_KEY="sk-..."
export GEMINI_API_KEY="..."
export MISTRAL_API_KEY="..."
export COHERE_API_KEY="..."
export GROQ_API_KEY="..."
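Since a missing key only surfaces as an error at call time, it can help to check the environment up front. A small stdlib-only sketch of that check, using the provider/variable mapping from the table above (the missing_keys helper is hypothetical, not part of llmcp):

```python
import os

# Provider -> environment variable mapping, as listed in the table above.
PROVIDER_ENV_VARS = {
    "openai": "OPENAI_API_KEY",
    "anthropic": "ANTHROPIC_API_KEY",
    "gemini": "GEMINI_API_KEY",
    "mistral": "MISTRAL_API_KEY",
    "cohere": "COHERE_API_KEY",
    "groq": "GROQ_API_KEY",
}

def missing_keys(providers):
    """Return the env vars that are unset or empty for the given providers."""
    return [PROVIDER_ENV_VARS[p] for p in providers
            if not os.environ.get(PROVIDER_ENV_VARS[p])]
```

For example, missing_keys(["openai", "gemini"]) returns the names of any of those two keys that still need to be exported before running llmcp test.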

Acknowledgements

This tool relies on the following libraries:

  • LiteLLM for interacting with various LLM APIs.
  • MCP Python SDK for implementing the Model Context Protocol.

License

MIT
