
MCP Multi-Server

Requires Python 3.10+ · MIT License

A Python library for managing connections to multiple Model Context Protocol (MCP) servers. This library provides a unified interface for discovering, aggregating, and routing capabilities (tools, resources, prompts) across multiple MCP servers.

Features

  • Multi-Server Management: Connect to and manage multiple MCP servers simultaneously
  • Automatic Capability Discovery: Discover tools, resources, prompts, and templates from all connected servers
  • Intelligent Routing: Automatically route tool calls, resource reads, and prompt retrievals to the correct server
  • Namespace Support: Use namespaced URIs for unambiguous resource routing
  • Collision Detection: Detect and warn about duplicate tool or prompt names across servers
  • Async Context Manager: Clean resource management with Python's async context managers

Installation

pip install mcp-multi-server

Or with Poetry:

poetry add mcp-multi-server

Optional Dependencies

For OpenAI integration:

pip install mcp-multi-server[openai]

For running examples:

pip install mcp-multi-server[examples]

Quick Start

1. Create a Server Configuration File

Create an mcp_servers.json file defining your MCP servers:

{
  "mcpServers": {
    "filesystem": {
      "command": "python",
      "args": ["-m", "my_servers.filesystem_server"]
    },
    "database": {
      "command": "python",
      "args": ["-m", "my_servers.database_server"]
    }
  }
}
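Before handing the file to the client, it can help to sanity-check the JSON with plain Python. This is an illustrative sketch only, using just the `mcpServers`, `command`, and `args` fields shown above; the library performs its own validation through its configuration models:

```python
import json

# Minimal sanity check for an mcp_servers.json document (illustrative only;
# the library's own config models are the authoritative validator).
def check_config(raw: str) -> list[str]:
    data = json.loads(raw)
    servers = data.get("mcpServers")
    if not isinstance(servers, dict) or not servers:
        raise ValueError("config must contain a non-empty 'mcpServers' object")
    for name, spec in servers.items():
        if "command" not in spec:
            raise ValueError(f"server {name!r} is missing 'command'")
        if not isinstance(spec.get("args", []), list):
            raise ValueError(f"server {name!r}: 'args' must be a list")
    return sorted(servers)

raw = """
{
  "mcpServers": {
    "filesystem": {"command": "python", "args": ["-m", "my_servers.filesystem_server"]},
    "database":   {"command": "python", "args": ["-m", "my_servers.database_server"]}
  }
}
"""
print(check_config(raw))  # ['database', 'filesystem']
```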

2. Use the Multi-Server Client

import asyncio
from mcp_multi_server import MultiServerClient

async def main():
    # Using context manager (recommended)
    async with MultiServerClient.from_config("mcp_servers.json") as client:
        # List all available tools from all servers
        tools = client.list_tools()
        print(f"Found {len(tools.tools)} tools")

        # Call a tool (automatically routed to the correct server)
        result = await client.call_tool(
            "read_file",
            {"path": "/path/to/file.txt"}
        )

        # List all resources with namespaced URIs
        resources = client.list_resources()

        # Read a resource (auto-routing via namespace)
        content = await client.read_resource(resources.resources[0].uri)

        # Get a prompt
        prompt = await client.get_prompt("code_review", {"language": "python"})

asyncio.run(main())

3. Programmatic Configuration

You can also configure servers programmatically without a JSON file:

from mcp_multi_server import MultiServerClient, MCPServersConfig, ServerConfig

config = MCPServersConfig(mcpServers={
    "my_server": ServerConfig(
        command="python",
        args=["-m", "my_package.my_server"]
    )
})

async with MultiServerClient.from_dict(config.model_dump()) as client:
    tools = client.list_tools()
    # ...

Examples

The repository includes comprehensive examples demonstrating various use cases. See the examples directory for:

  • Example MCP server implementations (tools, resources, prompts)
  • Example chat client showing usage patterns
  • Full client with OpenAI integration
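For the OpenAI integration, the key step is translating MCP tool definitions into OpenAI's function-calling format. The sketch below shows the general shape of that conversion; it is an assumption for illustration, and the shipped `mcp_tools_to_openai_format` helper should be used in practice:

```python
# Illustrative MCP -> OpenAI tool conversion. Field names (name, description,
# inputSchema) follow the MCP tool schema; prefer the library's
# mcp_tools_to_openai_format helper in real code.
def tools_to_openai(tools: list[dict]) -> list[dict]:
    return [
        {
            "type": "function",
            "function": {
                "name": t["name"],
                "description": t.get("description", ""),
                "parameters": t.get("inputSchema", {"type": "object", "properties": {}}),
            },
        }
        for t in tools
    ]

mcp_tool = {
    "name": "read_file",
    "description": "Read a file from disk",
    "inputSchema": {"type": "object", "properties": {"path": {"type": "string"}}},
}
converted = tools_to_openai([mcp_tool])
print(converted[0]["function"]["name"])  # read_file
```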

API Reference

MultiServerClient

Main class for managing multiple MCP servers.

Class Methods:

  • from_config(config_path: str) - Create client from JSON config file
  • from_dict(config_dict: Dict) - Create client from configuration dictionary

Instance Methods:

  • connect_all(stack: AsyncExitStack) - Connect to all configured servers
  • list_tools() - Get all tools from all servers
  • list_prompts() - Get all prompts from all servers
  • list_resources(use_namespace: bool = True) - Get all resources
  • list_resource_templates(use_namespace: bool = True) - Get all resource templates
  • call_tool(name, arguments, server_name=None) - Call a tool
  • read_resource(uri, server_name=None) - Read a resource
  • get_prompt(name, arguments=None, server_name=None) - Get a prompt
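`connect_all(stack)` takes an `AsyncExitStack`, the standard pattern for tearing down many async connections with a single exit. The stdlib sketch below demonstrates that pattern in isolation; `open_connection` and the server names are illustrative stand-ins, not library APIs:

```python
import asyncio
import contextlib

# Generic sketch of the AsyncExitStack pattern connect_all(stack) builds on:
# each connection is an async context manager entered onto one shared stack,
# so a single exit tears all of them down in reverse order.
connected: list[str] = []

@contextlib.asynccontextmanager
async def open_connection(name: str):
    connected.append(name)        # stand-in for spawning the server process
    try:
        yield name
    finally:
        connected.remove(name)    # stand-in for closing the transport

async def main() -> list[str]:
    async with contextlib.AsyncExitStack() as stack:
        for name in ("filesystem", "database"):
            await stack.enter_async_context(open_connection(name))
        return list(connected)    # both connections live inside the stack

opened = asyncio.run(main())
print(opened)     # ['filesystem', 'database']
print(connected)  # [] -- everything closed when the stack exited
```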

Utility Functions

  • print_capabilities_summary(client) - Print a summary of the capabilities discovered by the client
  • mcp_tools_to_openai_format(tools) - Convert MCP tools to OpenAI function format
  • format_namespace_uri(server_name, uri) - Create namespaced URI
  • parse_namespace_uri(uri) - Parse namespaced URI
  • extract_template_variables(template) - Extract variables from URI template
  • substitute_template_variables(template, variables) - Substitute template variables
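To make the namespace and template helpers concrete, here is a round-trip sketch. The encodings below (a `server:` prefix and `{var}` template placeholders) are assumptions for demonstration only; the library's `format_namespace_uri`, `parse_namespace_uri`, and `extract_template_variables` define the actual formats:

```python
import re

# Illustrative round trip for namespaced resource URIs. The "server:" prefix
# is an assumed encoding, not the library's documented format.
def format_ns(server_name: str, uri: str) -> str:
    return f"{server_name}:{uri}"

def parse_ns(ns_uri: str) -> tuple[str, str]:
    server_name, _, uri = ns_uri.partition(":")  # split at the first colon only
    return server_name, uri

# Illustrative variable extraction from a {var}-style URI template.
def extract_vars(template: str) -> list[str]:
    return re.findall(r"\{(\w+)\}", template)

ns = format_ns("filesystem", "file:///tmp/notes.txt")
print(parse_ns(ns))                              # ('filesystem', 'file:///tmp/notes.txt')
print(extract_vars("db://table/{name}/row/{id}"))  # ['name', 'id']
```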

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

License

This project is licensed under the MIT License - see the LICENSE file for details.

Download files

Source Distribution

mcp_multi_server-1.0.0.tar.gz (17.4 kB)

  • Uploaded via: poetry/2.2.1 CPython/3.14.0 Linux/6.11.0-1018-azure (Trusted Publishing: No)
  • SHA256: 4e140523593b3d1c7fb42d265d787290564503b3b1097d6bc9d97be46952f8c0
  • MD5: c5c22eab4a05ce6e4e921db5aa0cb7ef
  • BLAKE2b-256: 9a6db9ffb976f22e0f83b336ecde7408b59311ae6559b63cc8320aa2ae2756aa

Built Distribution

mcp_multi_server-1.0.0-py3-none-any.whl (17.0 kB)

  • Uploaded via: poetry/2.2.1 CPython/3.14.0 Linux/6.11.0-1018-azure (Trusted Publishing: No)
  • SHA256: ea44890c890eb9e2c3c3a89e7f24c415f44b055966967984157b254301e73c94
  • MD5: a5256786b048fc5e0da885b6992de835
  • BLAKE2b-256: 9e50a7678209b926767ab45e751dd1ef85caf8d8381622befdfc22a597c27e89
