MCP Multi-Server
A Python library for managing connections to multiple Model Context Protocol (MCP) servers. This library provides a unified interface for discovering, aggregating, and routing capabilities (tools, resources, prompts) across multiple MCP servers.
Features
- Multi-Server Management: Connect to and manage multiple MCP servers simultaneously
- Automatic Capability Discovery: Discover tools, resources, prompts, and templates from all connected servers
- Intelligent Routing: Automatically route tool calls, resource reads, and prompt retrievals to the correct server
- Namespace Support: Use namespaced URIs for unambiguous resource routing
- Collision Detection: Detect and warn about duplicate tool or prompt names across servers
- Async Context Manager: Clean resource management with Python's async context managers
- Sync & Async Support: Both async (`MultiServerClient`) and synchronous (`SyncMultiServerClient`) interfaces
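To illustrate what collision detection means in practice, here is a minimal, self-contained sketch (not the library's implementation; the function and variable names are hypothetical): aggregate tool names per server and report any name offered by more than one server.

```python
from collections import defaultdict


def find_collisions(server_tools: dict[str, list[str]]) -> dict[str, list[str]]:
    """Return tool names exposed by more than one server.

    server_tools maps a server name to the tool names it exposes.
    """
    owners: dict[str, list[str]] = defaultdict(list)
    for server, tools in server_tools.items():
        for tool in tools:
            owners[tool].append(server)
    return {name: servers for name, servers in owners.items() if len(servers) > 1}


collisions = find_collisions({
    "filesystem": ["read_file", "write_file"],
    "database": ["query", "read_file"],  # "read_file" collides with filesystem
})
print(collisions)  # {'read_file': ['filesystem', 'database']}
```

A client would typically log a warning for each entry in such a mapping and fall back to explicit server selection for ambiguous names.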
Installation
```bash
pip install mcp-multi-server
```

Or with Poetry:

```bash
poetry add mcp-multi-server
```

Optional Dependencies

For OpenAI integration:

```bash
pip install mcp-multi-server[openai]
```

For running examples:

```bash
pip install mcp-multi-server[examples]
```
Quick Start
1. Create a Server Configuration File
Create an mcp_servers.json file defining your MCP servers:

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "python",
      "args": ["-m", "my_servers.filesystem_server"]
    },
    "database": {
      "command": "python",
      "args": ["-m", "my_servers.database_server"]
    }
  }
}
```
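Before handing a config file to the client, it can be worth sanity-checking its shape with the standard library. This is only a sketch of the expected structure as shown above (the client performs its own validation):

```python
import json

# Parse the same configuration shown above (inlined here for a runnable example).
config = json.loads("""
{
  "mcpServers": {
    "filesystem": {"command": "python", "args": ["-m", "my_servers.filesystem_server"]},
    "database":   {"command": "python", "args": ["-m", "my_servers.database_server"]}
  }
}
""")

# Every server entry needs a command string; args, when present, is a list.
for name, server in config["mcpServers"].items():
    assert isinstance(server.get("command"), str), f"{name} is missing 'command'"
    assert isinstance(server.get("args", []), list)

print(sorted(config["mcpServers"]))  # ['database', 'filesystem']
```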
2. Use the Multi-Server Client
```python
import asyncio

from mcp_multi_server import MultiServerClient


async def main():
    # Using the context manager (recommended)
    async with MultiServerClient.from_config("mcp_servers.json") as client:
        # List all available tools from all servers
        tools = client.list_tools()
        print(f"Found {len(tools.tools)} tools")

        # Call a tool (automatically routed to the correct server)
        result = await client.call_tool(
            "read_file",
            {"path": "/path/to/file.txt"},
        )

        # List all resources with namespaced URIs
        resources = client.list_resources()

        # Read a resource (auto-routed via its namespace)
        content = await client.read_resource(resources.resources[0].uri)

        # Get a prompt
        prompt = await client.get_prompt("code_review", {"language": "python"})


asyncio.run(main())
```
3. Programmatic Configuration
You can also configure servers programmatically without a JSON file:
```python
from mcp_multi_server import MultiServerClient, MCPServersConfig, ServerConfig

config = MCPServersConfig(mcpServers={
    "my_server": ServerConfig(
        command="python",
        args=["-m", "my_package.my_server"],
    )
})

async with MultiServerClient.from_dict(config.model_dump()) as client:
    tools = client.list_tools()
    # ...
```
4. Synchronous Client
For non-async code, use the synchronous wrapper:
```python
from mcp_multi_server import SyncMultiServerClient

# Using the context manager (recommended)
with SyncMultiServerClient.from_config("mcp_servers.json") as client:
    tools = client.list_tools()
    result = client.call_tool("read_file", {"path": "/path/to/file.txt"})
    resources = client.list_resources()

# Or with programmatic configuration
config = {"mcpServers": {"my_server": {"command": "python", "args": ["-m", "my_server"]}}}
with SyncMultiServerClient.from_dict(config) as client:
    tools = client.list_tools()
```
The sync client runs a background event loop thread and provides the same API as the async client, with optional timeout parameters on blocking methods.
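The background-loop approach is a standard way to bridge sync and async code. A minimal, self-contained sketch of the pattern (this is not the library's internals; all names here are hypothetical) runs an event loop in a daemon thread and submits coroutines to it with `asyncio.run_coroutine_threadsafe`:

```python
import asyncio
import threading


class BackgroundLoop:
    """Run an asyncio event loop in a daemon thread; submit coroutines from sync code."""

    def __init__(self) -> None:
        self._loop = asyncio.new_event_loop()
        self._thread = threading.Thread(target=self._loop.run_forever, daemon=True)
        self._thread.start()

    def run(self, coro, timeout=None):
        # Block the calling thread until the coroutine finishes (or times out).
        future = asyncio.run_coroutine_threadsafe(coro, self._loop)
        return future.result(timeout)

    def shutdown(self) -> None:
        self._loop.call_soon_threadsafe(self._loop.stop)
        self._thread.join()


async def fetch_tools():
    await asyncio.sleep(0)  # stand-in for real async I/O against a server
    return ["read_file", "query"]


loop = BackgroundLoop()
tools = loop.run(fetch_tools(), timeout=5)
loop.shutdown()
print(tools)  # ['read_file', 'query']
```

The `timeout` passed to `future.result()` is how per-call timeouts on blocking methods can be implemented in this style.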
Examples
The repository includes comprehensive examples demonstrating various use cases. See the examples directory for:
- Example MCP server implementations (tools, resources, prompts)
- Example chat client showing usage patterns
- Full client with OpenAI integration
API Reference
MultiServerClient
Main class for managing multiple MCP servers.
Class Methods:
- `from_config(config_path: str)` - Create a client from a JSON config file
- `from_dict(config_dict: Dict)` - Create a client from a configuration dictionary
Instance Methods:
- `connect_all(stack: AsyncExitStack)` - Connect to all configured servers
- `list_tools()` - Get all tools from all servers
- `list_prompts()` - Get all prompts from all servers
- `list_resources(use_namespace: bool = True)` - Get all resources
- `list_resource_templates(use_namespace: bool = True)` - Get all resource templates
- `call_tool(name, arguments, server_name=None)` - Call a tool
- `read_resource(uri, server_name=None)` - Read a resource
- `get_prompt(name, arguments=None, server_name=None)` - Get a prompt
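`connect_all` takes an `AsyncExitStack`, which lets the caller own the connection lifetime rather than relying on the client's context manager. The underlying pattern is sketched below with a dummy async context manager standing in for a server connection (everything outside the API list above is hypothetical):

```python
import asyncio
from contextlib import AsyncExitStack, asynccontextmanager


@asynccontextmanager
async def dummy_connection(name: str):
    # Stand-in for one server connection; the stack closes it on exit.
    print(f"connect {name}")
    try:
        yield name
    finally:
        print(f"disconnect {name}")


async def main() -> list[str]:
    async with AsyncExitStack() as stack:
        # Each connection is entered on the shared stack, mirroring connect_all(stack);
        # all connections are torn down together, in reverse order, when the stack exits.
        return [
            await stack.enter_async_context(dummy_connection(n))
            for n in ("filesystem", "database")
        ]


print(asyncio.run(main()))  # ['filesystem', 'database']
```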
SyncMultiServerClient
Synchronous wrapper for non-async code. Same API as MultiServerClient.
Class Methods:
- `from_config(config_path: str)` - Create a client from a JSON config file
- `from_dict(config_dict: Dict)` - Create a client from a configuration dictionary
Instance Methods:
- `list_tools()` - Get all tools from all servers
- `list_prompts()` - Get all prompts from all servers
- `list_resources(use_namespace: bool = True)` - Get all resources
- `list_resource_templates(use_namespace: bool = True)` - Get all resource templates
- `call_tool(name, arguments, timeout=None, server_name=None)` - Call a tool
- `read_resource(uri, timeout=None, server_name=None)` - Read a resource
- `get_prompt(name, arguments=None, timeout=None, server_name=None)` - Get a prompt
- `shutdown()` - Explicitly shut down the client
Utility Functions
- `print_capabilities_summary(client)` - Print the client's discovered capabilities
- `mcp_tools_to_openai_format(tools)` - Convert MCP tools to OpenAI function format
- `format_namespace_uri(server_name, uri)` - Create a namespaced URI
- `parse_namespace_uri(uri)` - Parse a namespaced URI
- `extract_template_variables(template)` - Extract variables from a URI template
- `substitute_template_variables(template, variables)` - Substitute template variables
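The namespace and template helpers can be pictured with the following self-contained sketch. This is an illustration of the general idea, not the library's actual functions: the exact namespacing scheme and template syntax used by `mcp-multi-server` may differ, so treat every detail here as an assumption.

```python
import re


def make_namespace_uri(server_name: str, uri: str) -> str:
    # Illustrative scheme: prefix the URI with the owning server's name.
    return f"{server_name}:{uri}"


def split_namespace_uri(namespaced: str) -> tuple[str, str]:
    # Invert make_namespace_uri: split on the first colon only.
    server_name, _, uri = namespaced.partition(":")
    return server_name, uri


def extract_vars(template: str) -> list[str]:
    # RFC 6570-style {var} placeholders.
    return re.findall(r"\{(\w+)\}", template)


def substitute_vars(template: str, variables: dict[str, str]) -> str:
    return re.sub(r"\{(\w+)\}", lambda m: variables[m.group(1)], template)


ns = make_namespace_uri("filesystem", "file:///tmp/a.txt")
print(split_namespace_uri(ns))          # ('filesystem', 'file:///tmp/a.txt')
tmpl = "db://tables/{table}/rows/{row_id}"
print(extract_vars(tmpl))               # ['table', 'row_id']
print(substitute_vars(tmpl, {"table": "users", "row_id": "7"}))
```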
Contributing
Contributions are welcome! Please feel free to submit a Pull Request.
License
This project is licensed under the MIT License - see the LICENSE file for details.
Links
- Documentation: https://mcp-multi-server.readthedocs.io/
- Source Code: https://github.com/apisani1/mcp-multi-server
- Issue Tracker: https://github.com/apisani1/mcp-multi-server/issues
- MCP Protocol: https://modelcontextprotocol.io