# MCPMan (MCP Manager)

A tool for managing Model Context Protocol (MCP) servers.
MCPMan orchestrates interactions between LLMs and Model Context Protocol (MCP) servers, making it easy to create powerful agentic workflows.
## Quick Start

Run MCPMan instantly, without installing it, using `uvx`:

```bash
# Run with OpenAI
uvx mcpman -c server_configs/multi_server_mcp.json -i openai -m gpt-4o -p "Write a short poem about robots"

# Run with Claude
uvx mcpman -c server_configs/calculator_server_mcp.json -i anthropic -m claude-3-sonnet-20240229 -p "Calculate 245 * 378"

# Run with a local Ollama model
uvx mcpman -c server_configs/filesystem_server_mcp.json -i ollama -m llama3:8b -p "List files in this directory"
```
You can also run straight from GitHub for quick one-off executions:

```bash
uvx --from git+https://github.com/ericflo/mcpman mcpman -c server_configs/multi_server_mcp.json -i openai -m gpt-4o -p "What time is it in Tokyo?"
```
## Core Features
- One-command setup: Manage and launch MCP servers directly
- Tool orchestration: Automatically connect LLMs to any MCP-compatible tool
- Detailed logging: JSON structured logs for every interaction
- Multiple LLM support: Works with OpenAI, Anthropic, Google, Ollama, LM Studio, and more
- Flexible configuration: Supports stdio and SSE server communication
## Installation

```bash
# Install with pip
pip install mcpman

# Install with uv
uv pip install mcpman

# Install from GitHub
uv pip install git+https://github.com/ericflo/mcpman.git
```
## Basic Usage

```bash
mcpman -c <CONFIG_FILE> -i <IMPLEMENTATION> -m <MODEL> -p "<PROMPT>"
```

Examples:

```bash
# Use local models with Ollama
mcpman -c ./server_configs/filesystem_server_mcp.json \
  -i ollama \
  -m gemma3:4b-it-qat \
  -p "List files in the current directory and count the lines in README.md"

# Use OpenAI with a system message
mcpman -c ./server_configs/multi_server_mcp.json \
  -i openai \
  -m gpt-4o \
  -s "You are a helpful assistant. Use tools effectively." \
  -p "What time is it in Tokyo right now and what's the weather like there?"
```
## Server Configuration

MCPMan uses JSON configuration files to define the MCP servers it manages. Examples:

Node.js stdio server:

```json
{
  "mcpServers": {
    "calculator": {
      "command": "npx",
      "args": ["-y", "mcp-server"],
      "env": { "API_KEY": "value" }
    }
  }
}
```

Python stdio server:

```json
{
  "mcpServers": {
    "datetime": {
      "command": "python",
      "args": ["-m", "mcp_servers.datetime_utils"],
      "env": { "TIMEZONE_API_KEY": "abc123" }
    }
  }
}
```

SSE server (manually managed):

```json
{
  "mcpServers": {
    "filesystem": {
      "url": "http://localhost:3000/sse"
    }
  }
}
```
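Since `mcpServers` is a map, several servers can live in one file. A hypothetical configuration mixing a stdio server and a manually managed SSE server (the names and values are illustrative, not a shipped example) might look like:

```json
{
  "mcpServers": {
    "calculator": {
      "command": "npx",
      "args": ["-y", "mcp-server"],
      "env": { "API_KEY": "value" }
    },
    "filesystem": {
      "url": "http://localhost:3000/sse"
    }
  }
}
```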
## Key Options

| Option | Description |
|---|---|
| `-c, --config <PATH>` | Path to MCP server config file |
| `-i, --implementation <IMPL>` | LLM implementation (`openai`, `anthropic`, `google`, `ollama`, `lmstudio`) |
| `-m, --model <MODEL>` | Model name (`gpt-4o`, `claude-3-opus-20240229`, etc.) |
| `-p, --prompt <PROMPT>` | User prompt (text or file path) |
| `-s, --system <MESSAGE>` | Optional system message |
| `--base-url <URL>` | Custom endpoint URL |
| `--temperature <FLOAT>` | Sampling temperature (default: 0.7) |
| `--max-tokens <INT>` | Maximum response tokens |
| `--no-verify` | Disable task verification |

API keys are set via environment variables: `OPENAI_API_KEY`, `ANTHROPIC_API_KEY`, etc.
## Why MCPMan?
- Standardized interaction: Unified interface for diverse tools
- Simplified development: Abstract away LLM-specific tool call formats
- Debugging support: Detailed JSONL logs for every step in the agent process
- Local or cloud: Works with local or cloud-based LLMs
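The structured JSONL logs lend themselves to post-processing with a few lines of Python. The field names below (`event`, `tool`, `duration_ms`) are illustrative assumptions, not MCPMan's documented log schema:

```python
import json

# Sample JSONL log content. Field names are illustrative assumptions,
# not MCPMan's documented schema; inspect a real log in logs/ first.
raw_log = """\
{"event": "tool_call", "tool": "calculator", "duration_ms": 12}
{"event": "llm_response", "model": "gpt-4o", "duration_ms": 840}
{"event": "tool_call", "tool": "datetime", "duration_ms": 3}
"""

# Parse one JSON object per line, then keep only tool-call events.
entries = [json.loads(line) for line in raw_log.splitlines() if line.strip()]
tool_calls = [e for e in entries if e["event"] == "tool_call"]

for call in tool_calls:
    print(f"{call['tool']}: {call['duration_ms']} ms")
```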
## Supported LLMs
- OpenAI (GPT models)
- Anthropic (Claude models)
- Google Gemini
- OpenRouter
- Ollama (local models)
- LM Studio (local models)
## Development Setup

```bash
# Clone and set up
git clone https://github.com/ericflo/mcpman.git
cd mcpman

# Create an environment and install dependencies
uv venv
source .venv/bin/activate  # Linux/macOS
# or .venv\Scripts\activate  # Windows
uv pip install -e ".[dev]"

# Run tests
pytest tests/
```
## Project Structure

- `src/mcpman/`: Core source code
- `mcp_servers/`: Example MCP servers for testing
- `server_configs/`: Example configuration files
- `logs/`: Auto-generated structured JSONL logs
## License
Licensed under the Apache License 2.0.