llm-mcp
A comprehensive LLM plugin for Model Context Protocol (MCP) integration, enabling seamless interaction between LLM CLI and MCP servers.
Table of Contents
- Installation
- Quick Start
- Commands Reference
- Usage Examples
- Common Workflows
- Configuration
- Troubleshooting
Installation
pip install llm-mcp-cli
Requirements
- Python 3.8 or higher
- llm >= 0.13.0
- MCP-compatible servers (e.g., @modelcontextprotocol/server-filesystem)
Quick Start
- Add an MCP server:
llm mcp add filesystem npx @modelcontextprotocol/server-filesystem /path/to/directory
- List available tools:
llm mcp tools --format list
- Use tools in LLM conversations:
llm -m gpt-4 "List all files in my directory" $(llm mcp tools --format commands)
# Or include tools from a specific server only
llm -m gpt-4 "List all files in my directory" $(llm mcp tools --server fetch --format commands)
Commands Reference
Server Management
llm mcp add
Register a new MCP server.
Syntax:
llm mcp add <name> <command> [args...] [options]
Parameters:
- name (required) - Unique identifier for the server
- command (required) - Command to execute the server (e.g., npx, python)
- args (optional) - Additional arguments for the server command
Options:
- --env KEY=value - Set environment variables (can be used multiple times)
- --description - Add a description for the server
Examples:
# Add filesystem server
llm mcp add filesystem npx @modelcontextprotocol/server-filesystem /Users/docs
# First store your GitHub token (one-time setup)
llm keys set GITHUB_PERSONAL_ACCESS_TOKEN
# Add GitHub server (API key automatically resolved from LLM storage)
llm mcp add github npx @modelcontextprotocol/server-github
# Add server with description
llm mcp add myserver python /path/to/server.py \
--description "Custom MCP server for data processing"
# Multiple environment variables
llm mcp add api-server ./server \
--env API_KEY=secret \
--env DEBUG=true \
--env PORT=8080
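Each --env flag follows the usual KEY=value convention: only the first = separates the key from the value, so values may themselves contain =. A minimal sketch of that parsing (illustrative only, not the plugin's actual implementation):

```python
# Illustrative sketch: parse repeated --env KEY=value flags into a dict.
# Only the first "=" separates key from value, so values may contain "=".
def parse_env_flags(flags):
    env = {}
    for flag in flags:
        if "=" not in flag:
            raise ValueError(f"expected KEY=value, got {flag!r}")
        key, value = flag.split("=", 1)  # split on the first '=' only
        env[key] = value
    return env

print(parse_env_flags(["API_KEY=secret", "DEBUG=true", "PORT=8080"]))
# {'API_KEY': 'secret', 'DEBUG': 'true', 'PORT': '8080'}
```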
llm mcp remove
Remove a registered MCP server.
Syntax:
llm mcp remove <name>
Example:
llm mcp remove filesystem
llm mcp list
List all registered MCP servers.
Syntax:
llm mcp list [options]
Options:
- --enabled-only - Show only enabled servers
- --with-status - Include connection status information
Output includes:
- Server name with enabled/disabled indicator (✓/✗)
- Command and arguments
- Description (if provided)
- Environment variable count
- Connection status (with --with-status)
- Available tools count (with --with-status)
Examples:
# List all servers
llm mcp list
# List only enabled servers with status
llm mcp list --enabled-only --with-status
llm mcp enable
Enable a disabled MCP server.
Syntax:
llm mcp enable <name>
Example:
llm mcp enable filesystem
llm mcp disable
Disable an MCP server without removing it.
Syntax:
llm mcp disable <name>
Example:
llm mcp disable filesystem
llm mcp test
Test connectivity to an MCP server.
Syntax:
llm mcp test <name>
Output includes:
- Connection success/failure status
- Available tools count
- First 5 tool names (if available)
- Error messages (if connection fails)
Example:
llm mcp test filesystem
llm mcp describe
Show detailed information about a specific MCP server.
Syntax:
llm mcp describe <name>
Output includes:
- Server configuration details
- Environment variables (keys only, values hidden)
- Connection status
- Complete list of available tools with descriptions
Example:
llm mcp describe filesystem
llm mcp start
Manually start an MCP server connection.
Syntax:
llm mcp start <name>
Example:
llm mcp start filesystem
llm mcp stop
Stop an active MCP server connection.
Syntax:
llm mcp stop <name>
Example:
llm mcp stop filesystem
Tool Commands
llm mcp tools
List all available MCP tools from enabled servers.
Syntax:
llm mcp tools [options]
Options:
- --server <name> - Filter tools by a specific server
- --format <type> - Output format (default: list)
  - list - Detailed format with descriptions
  - names - Tool names only, one per line
  - commands - As -T flags ready for use with llm
- --names-only - (Deprecated) Equivalent to --format names
Examples:
# List all tools with descriptions
llm mcp tools
# Get tools from specific server
llm mcp tools --server filesystem
# Get tool names only
llm mcp tools --format names
# Get ready-to-use command flags
llm mcp tools --format commands
# Output: -T filesystem__read_file -T filesystem__write_file ...
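The commands format is simply the names output joined as -T flags. A small sketch of that transformation (illustrative only; the tool names below are examples):

```python
# Illustrative sketch: turn a list of tool names (the "names" format)
# into ready-to-use -T flags (the "commands" format).
def to_command_flags(tool_names):
    return " ".join(f"-T {name}" for name in tool_names)

print(to_command_flags(["filesystem__read_file", "filesystem__write_file"]))
# -T filesystem__read_file -T filesystem__write_file
```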
llm mcp call-tool
Call a specific MCP tool directly.
Syntax:
llm mcp call-tool <tool_name> [options]
Parameters:
- tool_name (required) - Tool name in the format server__tool
Options:
- --args <json> - JSON object with tool arguments (default: "{}")
Examples:
# Read a file
llm mcp call-tool filesystem__read_file \
--args '{"path": "/tmp/example.txt"}'
# List directory contents
llm mcp call-tool filesystem__list_directory \
--args '{"path": "/Users/docs"}'
# Call with complex arguments
llm mcp call-tool github__search_repositories \
--args '{"query": "language:python stars:>1000", "limit": 10}'
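When scripting call-tool, hand-writing JSON inside shell quotes is error-prone. The sketch below builds the command line programmatically; call_tool_command is a hypothetical helper (not part of the plugin) that assumes only the documented server__tool naming:

```python
import json
import shlex

# Hypothetical helper: build a call-tool command line. json.dumps
# guarantees valid JSON for --args; shlex.quote handles shell quoting.
def call_tool_command(tool_name, **arguments):
    server, _, tool = tool_name.partition("__")
    assert server and tool, "tool name must be in server__tool form"
    args = shlex.quote(json.dumps(arguments))
    return f"llm mcp call-tool {tool_name} --args {args}"

print(call_tool_command("filesystem__read_file", path="/tmp/example.txt"))
# llm mcp call-tool filesystem__read_file --args '{"path": "/tmp/example.txt"}'
```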
llm mcp status
Show overall MCP plugin status and statistics.
Syntax:
llm mcp status
Output includes:
- Total registered servers count
- Enabled servers count
- Connected servers count
- Available tools count
- Configuration directory path
- Log directory path
Example:
llm mcp status
Usage Examples
Basic Server Setup
# 1. Add a filesystem server for your documents
llm mcp add docs npx @modelcontextprotocol/server-filesystem ~/Documents
# 2. Store API keys securely (one-time setup)
llm keys set GITHUB_PERSONAL_ACCESS_TOKEN
# 3. Add a GitHub server (API key automatically resolved)
llm mcp add github npx @modelcontextprotocol/server-github
# 4. Verify servers are working
llm mcp test docs
llm mcp test github
# 5. List all available tools
llm mcp tools
Using Tools with LLM
# Method 1: Use the tools in a conversation
llm -m gpt-4 \
$(llm mcp tools --server docs --format commands) \
"What markdown files are in my Documents folder?"
# Method 2: Specify individual tools
llm -m claude-3-opus \
-T docs__read_file \
-T docs__write_file \
"Update the README.md file to include installation instructions"
# Method 3: Use all available tools
llm -m gpt-4 $(llm mcp tools --format commands) \
"Analyze the project structure and create a summary"
Direct Tool Invocation
# List files in a directory
llm mcp call-tool docs__list_directory \
--args '{"path": "/Users/me/Documents"}'
# Read a specific file
llm mcp call-tool docs__read_file \
--args '{"path": "/Users/me/Documents/notes.md"}'
# Search GitHub repositories
llm mcp call-tool github__search_repositories \
--args '{"query": "mcp server", "limit": 5}'
Common Workflows
1. Document Management Workflow
# Setup filesystem server for documents
llm mcp add documents npx @modelcontextprotocol/server-filesystem \
~/Documents ~/Projects
# Use with LLM to organize files
llm -m gpt-4 $(llm mcp tools --server documents --format commands) \
"Create a summary of all README files in my Projects folder"
2. Code Analysis Workflow
# Add server for code directory
llm mcp add codebase npx @modelcontextprotocol/server-filesystem \
/path/to/codebase
# Analyze code structure
llm -m claude-3-opus $(llm mcp tools --server codebase --format commands) \
"Analyze the Python files and identify potential refactoring opportunities"
3. Multi-Server Workflow
# Store API keys once
llm keys set GITHUB_PERSONAL_ACCESS_TOKEN
# Add multiple servers (API keys automatically resolved)
llm mcp add docs npx @modelcontextprotocol/server-filesystem ~/Documents
llm mcp add code npx @modelcontextprotocol/server-filesystem ~/Code
llm mcp add github npx @modelcontextprotocol/server-github
# Use all tools together
llm -m gpt-4 $(llm mcp tools --format commands) \
"Compare my local documentation with similar projects on GitHub"
4. Automatic API Key Resolution
The plugin automatically resolves common API keys from LLM's secure storage, eliminating the need for --env flags:
# 1. Store API keys securely using LLM's key storage (one-time setup)
llm keys set FIRECRAWL_API_KEY
llm keys set GITHUB_PERSONAL_ACCESS_TOKEN
llm keys set OPENAI_API_KEY
# 2. Add servers without needing to specify --env flags
llm mcp add firecrawl npx -- -y firecrawl-mcp
llm mcp add github npx @modelcontextprotocol/server-github
# 3. Test servers - API keys are automatically resolved
llm mcp test firecrawl # ✓ Uses FIRECRAWL_API_KEY from storage
llm mcp test github # ✓ Uses GITHUB_PERSONAL_ACCESS_TOKEN from storage
Supported API Keys (automatically resolved):
- FIRECRAWL_API_KEY - Firecrawl web scraping service
- GITHUB_PERSONAL_ACCESS_TOKEN, GITHUB_TOKEN - GitHub API access
- OPENAI_API_KEY - OpenAI API access
- ANTHROPIC_API_KEY - Anthropic API access
- GOOGLE_API_KEY - Google services
- BRAVE_SEARCH_API_KEY - Brave Search API
- TAVILY_API_KEY - Tavily search API
Resolution Priority:
1. Environment variable (if already set)
2. LLM key storage (llm keys get KEY_NAME)
3. Error if the key is not found in either location
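That resolution order can be sketched as follows. get_stored_key stands in for "llm keys get KEY_NAME" and is a hypothetical callable; this illustrates the documented order, not the plugin's actual code:

```python
import os

# Sketch of the documented key-resolution order: environment variable
# first, then LLM key storage, otherwise an error.
def resolve_key(name, get_stored_key):
    value = os.environ.get(name)      # 1. environment variable wins
    if value:
        return value
    value = get_stored_key(name)      # 2. fall back to LLM key storage
    if value:
        return value
    raise KeyError(f"{name} not found in environment or LLM key storage")

storage = {"GITHUB_PERSONAL_ACCESS_TOKEN": "ghp_example"}
print(resolve_key("GITHUB_PERSONAL_ACCESS_TOKEN", storage.get))
```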
5. Server Management Workflow
# Check overall status
llm mcp status
# List all servers with their status
llm mcp list --with-status
# Disable unused servers
llm mcp disable old-server
# Test specific server
llm mcp test docs
# Get detailed information
llm mcp describe docs
Configuration
The llm-mcp plugin stores its configuration in the LLM configuration directory:
- Config Directory (Unix/Linux/macOS): ~/.config/io.datasette.llm/mcp/
- Config Directory (Windows): %APPDATA%\io.datasette.llm\mcp\
Configuration Files
- servers.json - Server configurations
- logs/ - Server connection logs
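The exact schema of servers.json is not documented here; the sketch below shows one plausible shape, inferred from the fields that llm mcp list and llm mcp describe report (command, args, description, env, enabled). Treat it as an assumption, not a spec:

```python
import json

# Hypothetical servers.json shape, inferred from the fields the CLI
# reports. The real schema used by the plugin may differ.
example = {
    "filesystem": {
        "command": "npx",
        "args": ["@modelcontextprotocol/server-filesystem", "/Users/docs"],
        "description": "Local documents",
        "env": {},
        "enabled": True,
    }
}
print(json.dumps(example, indent=2))
```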
Environment Variables
Environment variables for servers are stored securely in the configuration and are not exposed in plain text when listing servers.
To update environment variables:
# Remove and re-add the server with new variables
llm mcp remove myserver
llm mcp add myserver command args --env NEW_KEY=new_value
Troubleshooting
Common Issues
Server won't connect
# Test the connection
llm mcp test servername
# Check server status
llm mcp describe servername
# Try restarting the server
llm mcp stop servername
llm mcp start servername
Tools not appearing
# Ensure server is enabled
llm mcp enable servername
# List tools for specific server
llm mcp tools --server servername
# Check server has tools available
llm mcp test servername
Environment variable issues
# Environment variables must be in KEY=value format
llm mcp add server command --env KEY=value # ✓ Correct
llm mcp add server command --env KEY value # ✗ Wrong
Debug Commands
# Get detailed server information
llm mcp describe servername
# Check overall plugin status
llm mcp status
# View server list with connection status
llm mcp list --with-status
# Test individual server connectivity
llm mcp test servername
Contributing
This is an open-source project. Contributions are welcome!
Development Setup
# Clone the repository
git clone https://github.com/eugenepyvovarov/llm-mcp.git
cd llm-mcp
# Install in development mode
pip install -e ".[dev]"
# Run tests
pytest
License
Apache 2.0 License - see LICENSE file for details.
Support
For issues, questions, or suggestions:
- GitHub Issues: github.com/eugenepyvovarov/llm-mcp/issues
- Documentation: This README
- MCP Specification: modelcontextprotocol.io