
MCP Multi-Server

Python 3.10+ License: MIT

A Python library for managing connections to multiple Model Context Protocol (MCP) servers. This library provides a unified interface for discovering, aggregating, and routing capabilities (tools, resources, prompts) across multiple MCP servers.

Features

  • Multi-Server Management: Connect to and manage multiple MCP servers simultaneously
  • Automatic Capability Discovery: Discover tools, resources, prompts, and templates from all connected servers
  • Intelligent Routing: Automatically route tool calls, resource reads, and prompt retrievals to the correct server
  • Namespace Support: Use namespaced URIs for unambiguous resource routing
  • Collision Detection: Detect and warn about duplicate tool or prompt names across servers
  • Async Context Manager: Clean resource management with Python's async context managers
  • Sync & Async Support: Both async (MultiServerClient) and synchronous (SyncMultiServerClient) interfaces

Installation

pip install mcp-multi-server

Or with Poetry:

poetry add mcp-multi-server

Optional Dependencies

For OpenAI integration:

pip install mcp-multi-server[openai]

For running examples:

pip install mcp-multi-server[examples]

Quick Start

1. Create a Server Configuration File

Create an mcp_servers.json file defining your MCP servers:

{
  "mcpServers": {
    "filesystem": {
      "command": "python",
      "args": ["-m", "my_servers.filesystem_server"]
    },
    "database": {
      "command": "python",
      "args": ["-m", "my_servers.database_server"]
    }
  }
}
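Before handing a config file to the client, it can be useful to sanity-check its shape. The helper below is a hypothetical illustration (not part of the library) that validates the `mcpServers` structure shown above using only the standard library:

```python
import json

def check_config(text: str) -> list[str]:
    """Return the server names found in an mcpServers config,
    raising ValueError if the shape is wrong."""
    data = json.loads(text)
    servers = data.get("mcpServers")
    if not isinstance(servers, dict) or not servers:
        raise ValueError("config must contain a non-empty 'mcpServers' object")
    for name, spec in servers.items():
        if "command" not in spec:
            raise ValueError(f"server {name!r} is missing a 'command'")
    return list(servers)

config_text = (
    '{"mcpServers": {"filesystem": '
    '{"command": "python", "args": ["-m", "my_servers.filesystem_server"]}}}'
)
print(check_config(config_text))  # ['filesystem']
```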

2. Use the Multi-Server Client

import asyncio
from mcp_multi_server import MultiServerClient

async def main():
    # Using context manager (recommended)
    async with MultiServerClient.from_config("mcp_servers.json") as client:
        # List all available tools from all servers
        tools = client.list_tools()
        print(f"Found {len(tools.tools)} tools")

        # Call a tool (automatically routed to the correct server)
        result = await client.call_tool(
            "read_file",
            {"path": "/path/to/file.txt"}
        )

        # List all resources with namespaced URIs
        resources = client.list_resources()

        # Read a resource (auto-routing via namespace)
        content = await client.read_resource(resources.resources[0].uri)

        # Get a prompt
        prompt = await client.get_prompt("code_review", {"language": "python"})

asyncio.run(main())

3. Programmatic Configuration

You can also configure servers programmatically without a JSON file:

from mcp_multi_server import MultiServerClient, MCPServersConfig, ServerConfig

config = MCPServersConfig(mcpServers={
    "my_server": ServerConfig(
        command="python",
        args=["-m", "my_package.my_server"]
    )
})

async with MultiServerClient.from_dict(config.model_dump()) as client:
    tools = client.list_tools()
    # ...

4. Synchronous Client

For non-async code, use the synchronous wrapper:

from mcp_multi_server import SyncMultiServerClient

# Using context manager (recommended)
with SyncMultiServerClient.from_config("mcp_servers.json") as client:
    tools = client.list_tools()
    result = client.call_tool("read_file", {"path": "/path/to/file.txt"})
    resources = client.list_resources()

# Or with programmatic configuration
config = {"mcpServers": {"my_server": {"command": "python", "args": ["-m", "my_server"]}}}
with SyncMultiServerClient.from_dict(config) as client:
    tools = client.list_tools()

The sync client runs a background event loop thread and provides the same API as the async client, with optional timeout parameters on blocking methods.
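The general sync-over-async pattern described above can be sketched with the standard library alone. This is not the library's actual implementation, just an illustration of how a dedicated event-loop thread can serve blocking calls with an optional timeout:

```python
import asyncio
import threading

class BackgroundLoop:
    """Minimal sketch: an event loop running on a daemon thread
    that synchronous code submits coroutines to."""

    def __init__(self):
        self._loop = asyncio.new_event_loop()
        self._thread = threading.Thread(target=self._loop.run_forever, daemon=True)
        self._thread.start()

    def run(self, coro, timeout=None):
        # Submit the coroutine to the loop thread and block until it finishes
        # (or until the timeout elapses, raising TimeoutError).
        future = asyncio.run_coroutine_threadsafe(coro, self._loop)
        return future.result(timeout=timeout)

    def shutdown(self):
        self._loop.call_soon_threadsafe(self._loop.stop)
        self._thread.join()

async def fake_call_tool(name):
    # Stand-in for an async client method such as call_tool.
    await asyncio.sleep(0.01)
    return f"result of {name}"

loop = BackgroundLoop()
print(loop.run(fake_call_tool("read_file"), timeout=5))  # result of read_file
loop.shutdown()
```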

Examples

The repository includes comprehensive examples demonstrating various use cases. See the examples directory for:

  • Example MCP server implementations (tools, resources, prompts)
  • Example chat client showing usage patterns
  • Full client with OpenAI integration
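For a sense of what the OpenAI integration involves, the sketch below converts an MCP-style tool description (a plain dict here, for illustration) into the OpenAI function-calling format. The library's mcp_tools_to_openai_format utility performs this conversion on real Tool objects; the exact field names below are assumptions based on the MCP tool schema:

```python
def tool_to_openai(tool: dict) -> dict:
    """Illustrative conversion of an MCP-style tool description to the
    OpenAI function-calling format."""
    return {
        "type": "function",
        "function": {
            "name": tool["name"],
            "description": tool.get("description", ""),
            "parameters": tool.get(
                "inputSchema", {"type": "object", "properties": {}}
            ),
        },
    }

mcp_tool = {
    "name": "read_file",
    "description": "Read a file from disk",
    "inputSchema": {
        "type": "object",
        "properties": {"path": {"type": "string"}},
        "required": ["path"],
    },
}
print(tool_to_openai(mcp_tool)["function"]["name"])  # read_file
```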

API Reference

MultiServerClient

Main class for managing multiple MCP servers.

Class Methods:

  • from_config(config_path: str) - Create client from JSON config file
  • from_dict(config_dict: Dict) - Create client from configuration dictionary

Instance Methods:

  • connect_all(stack: AsyncExitStack) - Connect to all configured servers
  • list_tools() - Get all tools from all servers
  • list_prompts() - Get all prompts from all servers
  • list_resources(use_namespace: bool = True) - Get all resources
  • list_resource_templates(use_namespace: bool = True) - Get all resource templates
  • call_tool(name, arguments, server_name=None) - Call a tool
  • read_resource(uri, server_name=None) - Read a resource
  • get_prompt(name, arguments=None, server_name=None) - Get a prompt
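Conceptually, name-based routing and collision detection amount to building a table from capability names to owning servers. The sketch below is a simplified illustration, not the client's actual internals (its collision policy may differ; first server wins here):

```python
def build_routing_table(capabilities: dict[str, list[str]]) -> dict[str, str]:
    """Map each tool name to the server that provides it,
    warning on duplicate names across servers."""
    table: dict[str, str] = {}
    for server, tools in capabilities.items():
        for tool in tools:
            if tool in table:
                print(f"warning: tool {tool!r} on both "
                      f"{table[tool]!r} and {server!r}")
            else:
                table[tool] = server
    return table

table = build_routing_table({
    "filesystem": ["read_file", "write_file"],
    "database": ["query", "read_file"],  # 'read_file' collides
})
print(table["query"])  # database
```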

SyncMultiServerClient

Synchronous wrapper for non-async code. Same API as MultiServerClient.

Class Methods:

  • from_config(config_path: str) - Create client from JSON config file
  • from_dict(config_dict: Dict) - Create client from configuration dictionary

Instance Methods:

  • list_tools() - Get all tools from all servers
  • list_prompts() - Get all prompts from all servers
  • list_resources(use_namespace: bool = True) - Get all resources
  • list_resource_templates(use_namespace: bool = True) - Get all resource templates
  • call_tool(name, arguments, timeout=None, server_name=None) - Call a tool
  • read_resource(uri, timeout=None, server_name=None) - Read a resource
  • get_prompt(name, arguments=None, timeout=None, server_name=None) - Get a prompt
  • shutdown() - Explicitly shutdown the client

Utility Functions

  • print_capabilities_summary(client) - Print a summary of the client's discovered capabilities
  • mcp_tools_to_openai_format(tools) - Convert MCP tools to OpenAI function format
  • format_namespace_uri(server_name, uri) - Create namespaced URI
  • parse_namespace_uri(uri) - Parse namespaced URI
  • extract_template_variables(template) - Extract variables from URI template
  • substitute_template_variables(template, variables) - Substitute template variables
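To illustrate what these utilities do, here is a self-contained sketch using a hypothetical `mcp://<server>/<uri>` namespace scheme and RFC 6570-style `{variable}` placeholders. The library's actual URI format may differ; the function names below are illustrative stand-ins, not the library's API:

```python
import re

def format_ns(server: str, uri: str) -> str:
    # Prefix a resource URI with its server's namespace.
    return f"mcp://{server}/{uri}"

def parse_ns(ns_uri: str) -> tuple[str, str]:
    # Split a namespaced URI back into (server, original URI).
    match = re.match(r"mcp://([^/]+)/(.*)", ns_uri)
    if not match:
        raise ValueError(f"not a namespaced URI: {ns_uri}")
    return match.group(1), match.group(2)

def extract_vars(template: str) -> list[str]:
    # Collect {variable} placeholder names from a URI template.
    return re.findall(r"\{(\w+)\}", template)

def substitute_vars(template: str, variables: dict[str, str]) -> str:
    # Replace each {variable} placeholder with its value.
    return re.sub(r"\{(\w+)\}", lambda m: variables[m.group(1)], template)

ns = format_ns("filesystem", "file:///tmp/a.txt")
print(parse_ns(ns))                          # ('filesystem', 'file:///tmp/a.txt')
print(extract_vars("file:///{dir}/{name}"))  # ['dir', 'name']
```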

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

License

This project is licensed under the MIT License - see the LICENSE file for details.

Download files


Source Distribution

mcp_multi_server-1.1.0.tar.gz (22.0 kB)


Built Distribution


mcp_multi_server-1.1.0-py3-none-any.whl (22.9 kB)


File details

Details for the file mcp_multi_server-1.1.0.tar.gz.

File metadata

  • Download URL: mcp_multi_server-1.1.0.tar.gz
  • Size: 22.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/2.3.1 CPython/3.14.2 Linux/6.11.0-1018-azure

File hashes

Hashes for mcp_multi_server-1.1.0.tar.gz:

  • SHA256: a9c5c3cd5c625fa4d52fc6f44f6ebb76fa418f45034417a5f1258a082f6a3b3a
  • MD5: 345f5789b9ce4b3bd299126ea3e64b5a
  • BLAKE2b-256: 443a27a941ee25ec41bd7f5c59a8d614b4bb4fbcf3af41640eb6c938959ffe97


File details

Details for the file mcp_multi_server-1.1.0-py3-none-any.whl.

File metadata

  • Download URL: mcp_multi_server-1.1.0-py3-none-any.whl
  • Size: 22.9 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/2.3.1 CPython/3.14.2 Linux/6.11.0-1018-azure

File hashes

Hashes for mcp_multi_server-1.1.0-py3-none-any.whl:

  • SHA256: 8d7dbfa1752cda869f20c2058140ed3b0e03a48a269ef508c5046d592e7cf6c0
  • MD5: 6fdface6c743b612ef801fce629197f6
  • BLAKE2b-256: a4c126ebd92d1ba30a8035cf510484c590cca106466d1f65379d0ff3d814ef48

