
Easy MCP Proxy

An MCP proxy server that aggregates tools from multiple upstream MCP servers and exposes them through tool views — filtered, transformed, and composed subsets of tools.

Status: Experimental

📖 Full Documentation | 🚀 Tutorial | 📚 Reference

Quick Start

1. Install

uv pip install easy-mcp-proxy

Or, from a source checkout:

uv pip install -e .

2. Create a config file

# config.yaml
mcp_servers:
  filesystem:
    command: npx
    args: [-y, "@modelcontextprotocol/server-filesystem", /home/user/documents]

tool_views:
  default:
    tools:
      filesystem:
        read_file: {}
        list_directory: {}

3. Run the proxy

# For Claude Desktop (stdio)
mcp-proxy serve --config config.yaml

# For HTTP clients
mcp-proxy serve --config config.yaml --transport http --port 8000

4. Use with Claude Desktop

Local (stdio) — runs the proxy as a subprocess:

{
  "mcpServers": {
    "proxy": {
      "command": "uv",
      "args": ["run", "mcp-proxy", "serve", "--config", "/path/to/config.yaml"]
    }
  }
}

Remote (HTTP) — connect to a proxy running on a server:

{
  "mcpServers": {
    "proxy": {
      "type": "http",
      "url": "https://your-proxy-server.example.com/mcp",
      "headers": {
        "Authorization": "Bearer your-auth-token"
      }
    }
  }
}

This requires authentication to be configured on the proxy. See mcp-proxy serve --help for auth options.

Example Use Cases

Reduce Tool Count with Search Mode

Too many tools overwhelming your LLM? Expose hundreds of tools through just two meta-tools:

tool_views:
  everything:
    exposure_mode: search
    include_all: true

This creates everything_search_tools (find tools by description) and everything_call_tool (call by name). The LLM searches first, then calls—no need to list every tool.
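
As an illustration, the two-step flow looks roughly like this. The meta-tool names come from the README (derived from the view name `everything`); the argument keys and the prefixed tool name below are assumptions, not the proxy's documented schema:

```json
{
  "step_1": {
    "tool": "everything_search_tools",
    "arguments": { "query": "read a file from disk" }
  },
  "step_2": {
    "tool": "everything_call_tool",
    "arguments": {
      "name": "filesystem_read_file",
      "arguments": { "path": "notes.md" }
    }
  }
}
```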

Create Domain-Specific Interfaces

Wrap generic filesystem tools into a purpose-built "skills library" interface:

mcp_servers:
  skills:
    command: npx
    args: [-y, "@modelcontextprotocol/server-filesystem", /home/user/skills]
    tools:
      read_file:
        name: get_skill           # Rename for clarity
        parameters:
          path:
            rename: skill_name    # Domain-specific parameter name
            description: "Skill file path (e.g., 'python/debugging.md')"
      directory_tree:
        name: browse_skills
        parameters:
          path:
            hidden: true          # Hide implementation detail
            default: "."          # Always start at root
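
With this config, a client listing tools would see something like the sketch below. The exact schema fields the proxy emits are not documented in this README, so treat this as illustrative: the renamed tools appear under their new names, `skill_name` replaces `path`, and the hidden `path` on `browse_skills` is bound to `"."` server-side and never exposed:

```json
{
  "tools": [
    {
      "name": "get_skill",
      "inputSchema": {
        "type": "object",
        "properties": {
          "skill_name": {
            "type": "string",
            "description": "Skill file path (e.g., 'python/debugging.md')"
          }
        },
        "required": ["skill_name"]
      }
    },
    {
      "name": "browse_skills",
      "inputSchema": { "type": "object", "properties": {} }
    }
  ]
}
```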

Search Multiple Sources Concurrently

Create a unified search that queries all your knowledge sources at once:

tool_views:
  unified:
    composite_tools:
      search_everything:
        description: "Search code, docs, and memory simultaneously"
        inputs:
          query: { type: string, required: true }
        parallel:
          code:
            tool: github.search_code
            args: { query: "{inputs.query}" }
          docs:
            tool: confluence.search
            args: { query: "{inputs.query}" }
          memory:
            tool: memory.search
            args: { text: "{inputs.query}" }
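
Conceptually, a `parallel` block is an async fan-out that merges each branch's result under its branch name. The following is a minimal sketch of that pattern under stated assumptions, not the proxy's actual implementation; `call_upstream` is a stand-in for the real MCP client call:

```python
import asyncio

async def call_upstream(tool: str, args: dict) -> dict:
    """Stand-in for a real upstream MCP tool call."""
    await asyncio.sleep(0)  # placeholder for network I/O
    return {"tool": tool, "args": args}

async def search_everything(query: str) -> dict:
    # Branch names and argument mappings mirror the YAML above.
    branches = {
        "code": ("github.search_code", {"query": query}),
        "docs": ("confluence.search", {"query": query}),
        "memory": ("memory.search", {"text": query}),
    }
    # Fan out to all upstreams concurrently; gather preserves order.
    results = await asyncio.gather(
        *(call_upstream(tool, args) for tool, args in branches.values())
    )
    # Merge each branch's result under its branch name.
    return dict(zip(branches.keys(), results))
```

Note the per-branch argument mapping: `memory.search` takes `text` while the others take `query`, which is exactly what the `args` templates in the YAML express.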

Reduce Context Usage with Output Caching

Large tool outputs (file contents, search results) consume valuable LLM context. Cache them and return a preview with a signed retrieval URL:

output_cache:
  enabled: true
  ttl_seconds: 3600        # URLs valid for 1 hour
  preview_chars: 500       # Show first 500 chars inline
  min_size: 10000          # Only cache outputs > 10KB

cache_secret: "${CACHE_SECRET}"
cache_base_url: "https://your-proxy.example.com"

The LLM gets a preview plus a retrieval token—it can load the full content only when needed, or delegate to a sub-agent that fetches and processes the data in its own context window. This enables Recursive Language Model (RLM) patterns where agents pass file references instead of file contents, dramatically reducing context usage while maintaining full access to the data.
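
To make "signed retrieval URL" concrete, here is a minimal sketch of one common scheme (HMAC-SHA256 over the cache key and an expiry timestamp). The proxy's actual token format is not documented in this README, so the function names and URL layout below are assumptions for illustration only:

```python
import hashlib
import hmac
import time

def sign_cache_url(base_url: str, cache_key: str, secret: str, ttl_seconds: int) -> str:
    """Build a retrieval URL that expires after ttl_seconds (hypothetical layout)."""
    expires = int(time.time()) + ttl_seconds
    payload = f"{cache_key}:{expires}".encode()
    sig = hmac.new(secret.encode(), payload, hashlib.sha256).hexdigest()
    return f"{base_url}/cache/{cache_key}?expires={expires}&sig={sig}"

def verify(cache_key: str, expires: int, sig: str, secret: str) -> bool:
    """Reject expired links, then check the signature in constant time."""
    if time.time() > expires:
        return False
    payload = f"{cache_key}:{expires}".encode()
    expected = hmac.new(secret.encode(), payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, sig)
```

Because the signature covers both the key and the expiry, a client cannot extend a link's lifetime or point it at a different cached object without knowing `cache_secret`.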

What Can It Do?

  • Aggregate multiple MCP servers (stdio or HTTP) into one endpoint
  • Filter which tools are exposed from each server
  • Rename tools and parameters for clearer interfaces
  • Bind parameter defaults or hide implementation details
  • Compose concurrent tools that fan out to multiple upstreams
  • Cache large outputs to reduce context window usage
  • Transform with pre/post hooks for logging, validation, or modification
  • Serve via stdio (Claude Desktop) or HTTP with multi-view routing

See the Use Cases Guide for detailed examples of each capability.

Documentation

  • Introduction — Overview and concepts
  • Tutorial — Step-by-step getting started guide
  • Use Cases — Problem-driven feature exploration
  • Reference — Complete feature and CLI documentation

Development

uv pip install -e ".[dev]"
make check  # Lint
make test   # Run tests (requires 100% coverage)

License

AGPL-3.0 — See LICENSE
