
Model-Agnostic MCP Library for LLMs

Project description


A Python library that lets any LLM (Large Language Model) use MCP (Model Context Protocol) tools through a unified interface. The goal is to let developers easily connect any LLM to tools such as web browsing, file operations, and more.

Core Concept

  • Leverage existing LangChain adapters rather than reinventing them
  • Focus on bridging MCPs and LangChain's tool ecosystem

Key Components

Connectors

Bridge to MCP implementations:

  • stdio.py: For local MCP processes
  • websocket.py: For remote WebSocket MCPs
  • http.py: For HTTP API MCPs
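Each connector exposes the same small surface (connect, send a request, close), so the rest of the library can stay transport-agnostic. Here is a conceptual sketch of that pattern with an in-memory stand-in transport; the class and method names are illustrative, not mcpeer's actual API:

```python
import asyncio
from abc import ABC, abstractmethod

class Connector(ABC):
    """Illustrative transport interface; not mcpeer's real classes."""

    @abstractmethod
    async def connect(self) -> None: ...

    @abstractmethod
    async def send(self, payload: dict) -> dict: ...

    @abstractmethod
    async def close(self) -> None: ...

class InMemoryConnector(Connector):
    """Stand-in transport that echoes requests, for demonstration only."""

    def __init__(self):
        self.connected = False

    async def connect(self) -> None:
        self.connected = True

    async def send(self, payload: dict) -> dict:
        # A real stdio/websocket/http connector would serialize this over
        # its transport; here we just echo it back.
        return {"echo": payload}

    async def close(self) -> None:
        self.connected = False

async def demo() -> dict:
    conn = InMemoryConnector()
    await conn.connect()
    reply = await conn.send({"method": "tools/list"})
    await conn.close()
    return reply

print(asyncio.run(demo()))  # {'echo': {'method': 'tools/list'}}
```

Because every connector satisfies the same interface, swapping a local stdio process for a remote WebSocket server is a one-line change at construction time.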

Tool Conversion

Convert between MCP and LangChain formats:

  • Convert MCP tool schemas to formats needed by different LLMs
  • Support OpenAI function calling, Anthropic tool format, etc.
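As a rough illustration of this kind of schema mapping, the sketch below converts an MCP-style tool description (name, description, JSON Schema for inputs) into OpenAI's function-calling tool format. The field names follow the public MCP and OpenAI specifications; mcpeer's internal converter may differ:

```python
def mcp_tool_to_openai(tool: dict) -> dict:
    """Map an MCP-style tool description to an OpenAI function-calling
    tool definition. Illustrative only; mcpeer's converter may differ."""
    return {
        "type": "function",
        "function": {
            "name": tool["name"],
            "description": tool.get("description", ""),
            # MCP publishes input schemas as JSON Schema, which OpenAI's
            # "parameters" field also accepts.
            "parameters": tool.get(
                "inputSchema", {"type": "object", "properties": {}}
            ),
        },
    }

browse = {
    "name": "browser_navigate",
    "description": "Navigate the browser to a URL",
    "inputSchema": {
        "type": "object",
        "properties": {"url": {"type": "string"}},
        "required": ["url"],
    },
}

converted = mcp_tool_to_openai(browse)
print(converted["function"]["name"])  # browser_navigate
```

Because both sides speak JSON Schema, the conversion is mostly a matter of relocating fields rather than translating types.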

Session Management

Handle connection lifecycle:

  • Authenticate and initialize MCP connections
  • Discover and register available tools
  • Handle tool calling with proper error management
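The lifecycle above can be sketched as a small session object: connect, discover and register tools, then call tools with error handling. The names here (and the fake connector used to keep the example self-contained) are hypothetical, not mcpeer's real API:

```python
import asyncio

class ToolError(Exception):
    pass

class Session:
    """Conceptual session lifecycle; illustrative, not mcpeer's API."""

    def __init__(self, connector):
        self.connector = connector
        self.tools: dict = {}

    async def initialize(self) -> None:
        await self.connector.connect()
        # Discovery: ask the server which tools it offers, then register them.
        listing = await self.connector.send({"method": "tools/list"})
        self.tools = {t["name"]: t for t in listing.get("tools", [])}

    async def call_tool(self, name: str, arguments: dict) -> dict:
        if name not in self.tools:
            raise ToolError(f"unknown tool: {name}")
        reply = await self.connector.send(
            {"method": "tools/call",
             "params": {"name": name, "arguments": arguments}}
        )
        if "error" in reply:
            raise ToolError(reply["error"])
        return reply

    async def close(self) -> None:
        await self.connector.close()

class FakeConnector:
    """In-memory stub so the example runs without a real MCP server."""

    async def connect(self) -> None:
        pass

    async def send(self, payload: dict) -> dict:
        if payload["method"] == "tools/list":
            return {"tools": [{"name": "echo"}]}
        return {"result": payload["params"]["arguments"]}

    async def close(self) -> None:
        pass

async def demo() -> dict:
    session = Session(FakeConnector())
    await session.initialize()
    try:
        return await session.call_tool("echo", {"text": "hi"})
    finally:
        await session.close()

print(asyncio.run(demo()))  # {'result': {'text': 'hi'}}
```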

Agent Integration

Ready-to-use agent implementations:

  • Pre-configured for MCP tool usage
  • Optimized prompts for tool selection
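One way "optimized prompts for tool selection" can work in practice is to fold the discovered tools into the system prompt so the model can choose among them. A minimal sketch, not mcpeer's actual prompt:

```python
def build_system_prompt(tools: list[dict]) -> str:
    """Compose a system prompt listing available tools (illustrative)."""
    lines = ["You can use the following tools. Pick the most relevant one."]
    for tool in tools:
        lines.append(f"- {tool['name']}: {tool.get('description', '')}")
    return "\n".join(lines)

prompt = build_system_prompt([
    {"name": "browser_navigate", "description": "Open a URL"},
    {"name": "read_file", "description": "Read a local file"},
])
print(prompt)
```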

Installation

pip install mcpeer

Or install from source:

git clone https://github.com/pietrozullo/mcpeer.git
cd mcpeer
pip install -e .

Quick Start

Here's a simple example to get you started:

import asyncio
from mcp import StdioServerParameters
from mcpeer import MCPAgent

async def main():
    # Create server parameters for stdio connection
    server_params = StdioServerParameters(
        command="npx",
        args=["@playwright/mcp@latest"],
    )

    # Create a model-agnostic MCP client
    mcp_client = MCPAgent(
        server_params=server_params,
        model_provider="anthropic",  # Or "openai"
        model_name="claude-3-7-sonnet-20250219",  # Or "gpt-4o" for OpenAI
        temperature=0.7
    )

    # Initialize the client
    await mcp_client.initialize()

    # Run a query using the agent with tools
    result = await mcp_client.run_query(
        "Using the internet, tell me how many people work at OpenAI"
    )

    print("Result:")
    print(result)

    # Close the client
    await mcp_client.close()

if __name__ == "__main__":
    asyncio.run(main())

Simplified Usage

You can also use the simplified interface that handles connector lifecycle management automatically:

import asyncio
from langchain_openai import ChatOpenAI
from mcpeer import MCPAgent
from mcpeer.connectors.stdio import StdioConnector

async def main():
    # Create the connector
    connector = StdioConnector(
        command="npx",
        args=["@playwright/mcp@latest"],
    )

    # Create the LLM
    llm = ChatOpenAI(model="gpt-4o-mini")

    # Create MCP client
    mcp_client = MCPAgent(connector=connector, llm=llm, max_steps=30)

    # Run a query - MCPAgent handles connector lifecycle internally
    result = await mcp_client.run(
        "Using the internet, tell me how many people work at OpenAI",
        # manage_connector=True is the default
    )

    print("Result:")
    print(result)

if __name__ == "__main__":
    asyncio.run(main())

Advanced Usage

See the examples directory for more advanced usage examples:

  • basic_usage.py: Shows basic usage with different models
  • simplified_usage.py: Shows how to use automatic connector lifecycle management
  • websocket_example.py: Shows how to connect to a remote MCP over WebSocket

Requirements

  • Python 3.8+
  • MCP implementation (like Playwright MCP)
  • LangChain and appropriate model libraries (OpenAI, Anthropic, etc.)

License

MIT



Download files

Download the file for your platform.

Source Distribution

mcpeer-0.1.0.tar.gz (16.0 kB)


Built Distribution


mcpeer-0.1.0-py3-none-any.whl (19.8 kB)


File details

Details for the file mcpeer-0.1.0.tar.gz.

File metadata

  • Download URL: mcpeer-0.1.0.tar.gz
  • Upload date:
  • Size: 16.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.12.9

File hashes

Hashes for mcpeer-0.1.0.tar.gz:

  • SHA256: 11e6743ee0e525f3515f1ea872e34749ae413bbd007c975fd8a48bb0e6b5b5b5
  • MD5: 4c61b1eeed7efa7530ddc687c69f7378
  • BLAKE2b-256: 22adb656c422956a1c00a34f44ab5a1e809420cad868afab0fcd9cf4ade88fb6
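To check a downloaded release against a published digest, you can compute the hash locally with Python's standard hashlib. A generic sketch; the demonstration digest below is for the bytes b"hello", just to show the mechanics (for a real check, read the downloaded archive's bytes and compare against the digest published above):

```python
import hashlib

def verify_sha256(data: bytes, expected_hex: str) -> bool:
    """Return True if the SHA256 digest of data matches expected_hex."""
    return hashlib.sha256(data).hexdigest() == expected_hex

# Known digest of b"hello", used only to demonstrate the mechanics.
known = "2cf24dba5fb0a30e26e83b2ac5b9e29e1b161e5c1fa7425e73043362938b9824"
print(verify_sha256(b"hello", known))  # True
```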


Provenance

The following attestation bundles were made for mcpeer-0.1.0.tar.gz:

Publisher: publish.yml on pietrozullo/mcpeer

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file mcpeer-0.1.0-py3-none-any.whl.

File metadata

  • Download URL: mcpeer-0.1.0-py3-none-any.whl
  • Upload date:
  • Size: 19.8 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.12.9

File hashes

Hashes for mcpeer-0.1.0-py3-none-any.whl:

  • SHA256: 9619e8fe55831f5a835ac5222a8524814736221dc0c72fb9e7c4e7bee738f17b
  • MD5: 453b216a7435efa22429d894f0b771c2
  • BLAKE2b-256: 20b91803e2372912a3517eeda9a5f706e0c44596828fa88113c472d1ddcbf511


Provenance

The following attestation bundles were made for mcpeer-0.1.0-py3-none-any.whl:

Publisher: publish.yml on pietrozullo/mcpeer

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
