
Command line interface for MCP client

Project description

MCP CLI client with GigaChat Support

GigaChat configuration

cp .env.example .env

Configure your credentials to access GigaChat. See https://github.com/ai-forever/gigachat for the available authentication methods.

Also check that GigaChat is used in mcp-server-config.json:

  "llm": {
    "provider": "gigachat",
    "model": "GigaChat-2-Max",
    "verify_ssl_certs": false,
    "base_url": "https://gigachat.sberdevices.ru/v1",
    "top_p": 0
  }

MCP server console experiments

  1. Run the MCP server, for example:

python mcp_server.py

  2. Type the first command directly in the console to initialize the server:

{"jsonrpc":"2.0","id":0,"method":"initialize","params":{"protocolVersion":"2024-11-05","capabilities":{},"clientInfo":{"name":"console","version":"1.0"}}}

Answer:

{"jsonrpc":"2.0","id":0,"result":{"protocolVersion":"2024-11-05","capabilities":{"experimental":{},"prompts":{"listChanged":false},"resources":{"subscribe":false,"listChanged":false},"tools":{"listChanged":false}},"serverInfo":{"name":"Math","version":"1.6.0"}}}

  3. Send the notification that the connection was initialized:

{"method":"notifications/initialized","jsonrpc":"2.0"}

Answer:

No answer (notifications do not receive a response)

  4. Send a request to list the available tools:

{"method":"tools/list","jsonrpc":"2.0","id":1}

Answer:

{
    "jsonrpc": "2.0",
    "id": 1,
    "result": {
        "tools": [
            {
                "name": "find_preson",
                "description": "Find an info about some person by name",
                "inputSchema": {
                    "$defs": {
                        "Query": {
                            "properties": {
                                "query": {
                                    "title": "Query",
                                    "type": "string"
                                }
                            },
                            "required": [
                                "query"
                            ],
                            "title": "Query",
                            "type": "object"
                        }
                    },
                    "properties": {
                        "name": {
                            "$ref": "#/$defs/Query"
                        }
                    },
                    "required": [
                        "name"
                    ],
                    "title": "find_presonArguments",
                    "type": "object"
                }
            },
            {
                "name": "add",
                "description": "Add two numbers",
                "inputSchema": {
                    "properties": {
                        "a": {
                            "title": "A",
                            "type": "number"
                        },
                        "b": {
                            "title": "B",
                            "type": "number"
                        }
                    },
                    "required": [
                        "a",
                        "b"
                    ],
                    "title": "addArguments",
                    "type": "object"
                }
            },
            {
                "name": "multiply",
                "description": "Multiply two numbers",
                "inputSchema": {
                    "properties": {
                        "a": {
                            "title": "A",
                            "type": "number"
                        },
                        "b": {
                            "title": "B",
                            "type": "number"
                        }
                    },
                    "required": [
                        "a",
                        "b"
                    ],
                    "title": "multiplyArguments",
                    "type": "object"
                }
            }
        ]
    }
}
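The console messages above can also be built programmatically. Below is a sketch in Python, assuming newline-delimited JSON-RPC over stdio; the helper names (and the clientInfo values) are hypothetical, and a `tools/call` builder for the `add` tool is included for completeness:

```python
import json

def initialize_request(request_id: int = 0) -> str:
    # The first message a client sends; params follow MCP protocol 2024-11-05.
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "initialize",
        "params": {
            "protocolVersion": "2024-11-05",
            "capabilities": {},
            "clientInfo": {"name": "console", "version": "1.0"},
        },
    })

def initialized_notification() -> str:
    # Notifications carry no "id" and expect no answer.
    return json.dumps({"jsonrpc": "2.0", "method": "notifications/initialized"})

def tools_list_request(request_id: int = 1) -> str:
    return json.dumps({"jsonrpc": "2.0", "id": request_id, "method": "tools/list"})

def tool_call_request(name: str, arguments: dict, request_id: int = 2) -> str:
    # e.g. tool_call_request("add", {"a": 2, "b": 3}) for the tool listed above
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": name, "arguments": arguments},
    })
```

Written one per line to the server's stdin, these reproduce the transcript above.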

Original readme:

A simple CLI program to run LLM prompts and act as a Model Context Protocol (MCP) client.

You can use any MCP-compatible servers from the convenience of your terminal.

It acts as an alternative client to Claude Desktop. Additionally, you can use any LLM provider such as OpenAI, Groq, or a local LLM model via llama.

C4 Diagram

Setup

  1. Install via pip:

    pip install mcp-client-cli
    
  2. Create a ~/.llm/config.json file to configure your LLM and MCP servers:

    {
      "systemPrompt": "You are an AI assistant helping a software engineer...",
      "llm": {
        "provider": "openai",
        "model": "gpt-4",
        "api_key": "your-openai-api-key",
        "temperature": 0.7,
        "base_url": "https://api.openai.com/v1"  // Optional, for OpenRouter or other providers
      },
      "mcpServers": {
        "fetch": {
          "command": "uvx",
          "args": ["mcp-server-fetch"],
          "requires_confirmation": ["fetch"],
          "enabled": true,  // Optional, defaults to true
          "exclude_tools": []  // Optional, list of tool names to exclude
        },
        "brave-search": {
          "command": "npx",
          "args": ["-y", "@modelcontextprotocol/server-brave-search"],
          "env": {
            "BRAVE_API_KEY": "your-brave-api-key"
          },
          "requires_confirmation": ["brave_web_search"]
        },
        "youtube": {
          "command": "uvx",
          "args": ["--from", "git+https://github.com/adhikasp/mcp-youtube", "mcp-youtube"]
        }
      }
    }
    

    Note:

    • See CONFIG.md for complete documentation of the configuration format
    • Use requires_confirmation to specify which tools need user confirmation before execution
    • The LLM API key can also be set via environment variables LLM_API_KEY or OPENAI_API_KEY
    • The config file can be placed in either ~/.llm/config.json or $PWD/.llm/config.json
    • You can comment out lines in the JSON config file with // if you like to switch between configurations
  3. Run the CLI:

    llm "What is the capital city of North Sumatra?"
    
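Since the notes above say the JSON config file may contain // comments, a loader has to strip them before parsing. A minimal sketch of one way this could work (an assumption for illustration, not the project's actual loader); note it must leave `//` inside string values such as `"https://api.openai.com/v1"` untouched:

```python
import json

def strip_json_comments(text: str) -> str:
    """Remove // line comments that sit outside string literals."""
    out = []
    in_string = False
    i = 0
    while i < len(text):
        c = text[i]
        if in_string:
            out.append(c)
            if c == "\\" and i + 1 < len(text):  # keep escaped character
                out.append(text[i + 1])
                i += 2
                continue
            if c == '"':
                in_string = False
            i += 1
        elif c == '"':
            in_string = True
            out.append(c)
            i += 1
        elif text[i:i + 2] == "//":
            nl = text.find("\n", i)  # drop the rest of the line
            i = len(text) if nl == -1 else nl
        else:
            out.append(c)
            i += 1
    return "".join(out)
```

The cleaned text can then be passed straight to `json.loads`.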

Usage

Basic Usage

$ llm What is the capital city of North Sumatra?
The capital city of North Sumatra is Medan.

You can omit the quotes, but be careful with bash special characters like &, |, ; that might be interpreted by your shell.

You can also pipe input from other commands or files:

$ echo "What is the capital city of North Sumatra?" | llm
The capital city of North Sumatra is Medan.

$ echo "Given a location, tell me its capital city." > instructions.txt
$ cat instructions.txt | llm "West Java"
The capital city of West Java is Bandung.

Image Input

You can pipe image files to analyze them with multimodal LLMs:

$ cat image.jpg | llm "What do you see in this image?"
[LLM will analyze and describe the image]

$ cat screenshot.png | llm "Is there any error in this screenshot?"
[LLM will analyze the screenshot and point out any errors]
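To support this, the CLI has to tell piped image bytes apart from text. One plausible approach (hypothetical, not necessarily what the project does) is sniffing the file's magic numbers:

```python
def sniff_stdin_kind(data: bytes) -> str:
    """Classify piped bytes as PNG, JPEG, or plain text by magic number."""
    if data.startswith(b"\x89PNG\r\n\x1a\n"):
        return "image/png"
    if data.startswith(b"\xff\xd8\xff"):
        return "image/jpeg"
    return "text"
```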

Using Prompt Templates

You can use predefined prompt templates by using the p prefix followed by the template name and its arguments:

# List available prompt templates
$ llm --list-prompts

# Use a template
$ llm p review  # Review git changes
$ llm p commit  # Generate commit message
$ llm p yt url=https://youtube.com/...  # Summarize YouTube video
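The `p <template> key=value` invocation style above could be parsed roughly as follows (`parse_prompt_invocation` is a hypothetical name, not a function of the project):

```python
def parse_prompt_invocation(args: list[str]):
    """Split ["p", "<template>", "key=value", ...] into (name, kwargs).
    Returns None when the arguments are not a template invocation."""
    if len(args) < 2 or args[0] != "p":
        return None
    name = args[1]
    kwargs = {}
    for token in args[2:]:
        key, _, value = token.partition("=")
        kwargs[key] = value
    return name, kwargs
```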

Triggering a tool

$ llm What is the top article on hackernews today?

================================== Ai Message ==================================
Tool Calls:
  brave_web_search (call_eXmFQizLUp8TKBgPtgFo71et)
 Call ID: call_eXmFQizLUp8TKBgPtgFo71et
  Args:
    query: site:news.ycombinator.com
    count: 1
Brave Search MCP Server running on stdio

# If the tool requires confirmation, you'll be prompted:
Confirm tool call? [y/n]: y

================================== Ai Message ==================================
Tool Calls:
  fetch (call_xH32S0QKqMfudgN1ZGV6vH1P)
 Call ID: call_xH32S0QKqMfudgN1ZGV6vH1P
  Args:
    url: https://news.ycombinator.com/
================================= Tool Message =================================
Name: fetch

[TextContent(type='text', text='Contents [REDACTED]')]
================================== Ai Message ==================================

The top article on Hacker News today is:

### [Why pipes sometimes get "stuck": buffering](https://jvns.ca)
- **Points:** 31
- **Posted by:** tanelpoder
- **Posted:** 1 hour ago

You can view the full list of articles on [Hacker News](https://news.ycombinator.com/)

To bypass tool confirmation requirements, use the --no-confirmations flag:

$ llm --no-confirmations "What is the top article on hackernews today?"

For use in bash scripts, add --no-intermediates so that only the concluding message is printed, without intermediate messages.

$ llm --no-intermediates "What is the time in Tokyo right now?"
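The confirmation behavior follows from the requires_confirmation lists in the config. A sketch of the decision, assuming the config shape shown in Setup (the helper name is hypothetical):

```python
def needs_confirmation(tool_name: str, config: dict,
                       no_confirmations: bool = False) -> bool:
    """True if any configured MCP server lists tool_name in its
    requires_confirmation list and confirmations are not bypassed."""
    if no_confirmations:  # the --no-confirmations flag skips every prompt
        return False
    return any(
        tool_name in server.get("requires_confirmation", [])
        for server in config.get("mcpServers", {}).values()
    )
```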

Continuation

Add a c prefix to your message to continue the last conversation.

$ llm asldkfjasdfkl
It seems like your message might have been a typo or an error. Could you please clarify or provide more details about what you need help with?
$ llm c what did i say previously?
You previously typed "asldkfjasdfkl," which appears to be a random string of characters. If you meant to ask something specific or if you have a question, please let me know!

Clipboard Support

You can use content from your clipboard using the cb command:

# After copying text to clipboard
$ llm cb
[LLM will process the clipboard text]

$ llm cb "What language is this code written in?"
[LLM will analyze the clipboard text with your question]

# After copying an image to clipboard
$ llm cb "What do you see in this image?"
[LLM will analyze the clipboard image]

# You can combine it with continuation
$ llm cb c "Tell me more about what you see"
[LLM will continue the conversation about the clipboard content]

The clipboard feature works in:

  • Native Windows/macOS/Linux environments
    • Windows: Uses PowerShell
    • macOS: Uses pbpaste for text, pngpaste for images (optional)
    • Linux: Uses xclip (required for clipboard support)
  • Windows Subsystem for Linux (WSL)
    • Accesses the Windows clipboard through PowerShell
    • Works with both text and images
    • Make sure you have access to powershell.exe from WSL

Required tools for clipboard support:

  • Windows: PowerShell (built-in)
  • macOS:
    • pbpaste (built-in) for text
    • pngpaste (optional) for images: brew install pngpaste
  • Linux:
    • xclip: sudo apt install xclip or equivalent

The CLI automatically detects if the clipboard content is text or image and handles it appropriately.
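The platform table above maps naturally onto a small command-selection helper. A sketch (hypothetical; assumes `pngpaste -` writes the clipboard image to stdout and WSL falls back to powershell.exe):

```python
def clipboard_command(system: str, have_xclip: bool,
                      want_image: bool = False) -> list[str]:
    """Pick the external clipboard tool for the given platform."""
    if system == "Windows":
        return ["powershell.exe", "-Command", "Get-Clipboard"]
    if system == "Darwin":
        return ["pngpaste", "-"] if want_image else ["pbpaste"]
    # Linux: use xclip if installed; under WSL, fall back to the
    # Windows clipboard through PowerShell.
    if have_xclip:
        extra = ["-t", "image/png"] if want_image else []
        return ["xclip", "-selection", "clipboard", "-o", *extra]
    return ["powershell.exe", "-Command", "Get-Clipboard"]
```

The chosen command would then be run with `subprocess.run(..., capture_output=True)` and its stdout handed to the LLM.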

Additional Options

$ llm --list-tools                # List all available tools
$ llm --list-prompts              # List available prompt templates
$ llm --no-tools                  # Run without any tools
$ llm --force-refresh             # Force refresh tool capabilities cache
$ llm --text-only                 # Output raw text without markdown formatting
$ llm --show-memories             # Show user memories
$ llm --model gpt-4               # Override the model specified in config

Contributing

Feel free to submit issues and pull requests for improvements or bug fixes.



Download files

Source Distribution

mcp_client_cli_gigachat-1.0.2.tar.gz (220.4 kB)

Built Distribution

mcp_client_cli_gigachat-1.0.2-py3-none-any.whl (25.3 kB)

File hashes

Hashes for mcp_client_cli_gigachat-1.0.2.tar.gz:

Algorithm Hash digest
SHA256 2ce0982106c801b4b722d0823f411ba28117ca38fce115b4d0e90aba7628a554
MD5 398a2d3465fa48be2218733122c1f5da
BLAKE2b-256 dd7b6d01c112995dd03cbf3b52a385e45a77ac599b7263bb34ceb29df03ebbc7

Hashes for mcp_client_cli_gigachat-1.0.2-py3-none-any.whl:

Algorithm Hash digest
SHA256 3bfbc414ebe14e66d2538104e101bc2a6380d541cc4ca0f20ba1205eaa17f5b0
MD5 12a9124ee78b426c179664faa8c4e552
BLAKE2b-256 0571ac8b8a551c1d45fdf9a66ec5ca9a7ac68514d0cf3d11da67a14b98cc4aee
