Model-Agnostic MCP Library for LLMs
A Python library that lets any LLM (Large Language Model) use MCP (Model Context Protocol) tools through a unified interface. The goal is to let developers easily connect any LLM to tools such as web browsing, file operations, and more.
Core Concept
- Leverage existing LangChain adapters rather than reinventing them
- Focus on bridging MCPs and LangChain's tool ecosystem
Key Components
Connectors
Bridge to MCP implementations:
- stdio.py: for local MCP processes
- websocket.py: for remote WebSocket MCPs
- http.py: for HTTP API MCPs
Tool Conversion
Convert between MCP and LangChain formats:
- Convert MCP tool schemas to formats needed by different LLMs
- Support OpenAI function calling, Anthropic tool format, etc.
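To make the conversion concrete, here is a sketch of mapping an MCP tool schema (`name`/`description`/`inputSchema`) to the OpenAI function-calling format. The function name and fallback defaults are assumptions for illustration; the library's real conversion layer may differ:

```python
def mcp_tool_to_openai(tool: dict) -> dict:
    """Convert one MCP tool schema into an OpenAI function-calling tool entry.

    MCP's inputSchema is already JSON Schema, so it can be passed through
    as the function's "parameters" object.
    """
    return {
        "type": "function",
        "function": {
            "name": tool["name"],
            "description": tool.get("description", ""),
            "parameters": tool.get(
                "inputSchema", {"type": "object", "properties": {}}
            ),
        },
    }


# Example: a browser-navigation tool as an MCP server might describe it
mcp_tool = {
    "name": "browser_navigate",
    "description": "Navigate the browser to a URL",
    "inputSchema": {
        "type": "object",
        "properties": {"url": {"type": "string"}},
        "required": ["url"],
    },
}
openai_tool = mcp_tool_to_openai(mcp_tool)
```

An Anthropic-format converter would be analogous, placing the same JSON Schema under an `input_schema` key instead.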
Session Management
Handle connection lifecycle:
- Authenticate and initialize MCP connections
- Discover and register available tools
- Handle tool calling with proper error management
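The lifecycle described above (connect, discover, use, always clean up) maps naturally onto an async context manager. A sketch under assumed names, with a stand-in connector so it is self-contained:

```python
import asyncio
from contextlib import asynccontextmanager


@asynccontextmanager
async def mcp_session(connector):
    """Hypothetical lifecycle helper: connect, discover tools, and
    guarantee disconnection even if a tool call inside the block raises."""
    await connector.connect()
    try:
        yield await connector.list_tools()
    finally:
        await connector.disconnect()


class FakeConnector:
    """Stand-in connector used only to demonstrate the lifecycle."""

    def __init__(self):
        self.connected = False

    async def connect(self):
        self.connected = True

    async def list_tools(self):
        return [{"name": "browser_navigate"}]

    async def disconnect(self):
        self.connected = False


async def demo():
    conn = FakeConnector()
    async with mcp_session(conn) as tools:
        assert conn.connected and tools[0]["name"] == "browser_navigate"
    assert not conn.connected  # disconnected on exit, errors included


asyncio.run(demo())
```

The `try`/`finally` is the key point: a failed tool call must not leak a subprocess or socket.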
Agent Integration
Ready-to-use agent implementations:
- Pre-configured for MCP tool usage
- Optimized prompts for tool selection
Installation
```bash
pip install mcpeer
```
Or install from source:
```bash
git clone https://github.com/pietrozullo/mcpeer.git
cd mcpeer
pip install -e .
```
Quick Start
Here's a simple example to get you started:
```python
import asyncio

from mcp import StdioServerParameters

from mcpeer import MCPAgent


async def main():
    # Create server parameters for the stdio connection
    server_params = StdioServerParameters(
        command="npx",
        args=["@playwright/mcp@latest"],
    )

    # Create a model-agnostic MCP client
    mcp_client = MCPAgent(
        server_params=server_params,
        model_provider="anthropic",  # Or "openai"
        model_name="claude-3-7-sonnet-20250219",  # Or "gpt-4o" for OpenAI
        temperature=0.7,
    )

    # Initialize the client
    await mcp_client.initialize()

    # Run a query using the agent with tools
    result = await mcp_client.run_query(
        "Using the internet, tell me how many people work at OpenAI"
    )

    print("Result:")
    print(result)

    # Close the client
    await mcp_client.close()


if __name__ == "__main__":
    asyncio.run(main())
```
Simplified Usage
You can also use the simplified interface that handles connector lifecycle management automatically:
```python
import asyncio

from langchain_openai import ChatOpenAI

from mcpeer import MCPAgent
from mcpeer.connectors.stdio import StdioConnector


async def main():
    # Create the connector
    connector = StdioConnector(
        command="npx",
        args=["@playwright/mcp@latest"],
    )

    # Create the LLM
    llm = ChatOpenAI(model="gpt-4o-mini")

    # Create the MCP client
    mcp_client = MCPAgent(connector=connector, llm=llm, max_steps=30)

    # Run a query - MCPAgent handles the connector lifecycle internally
    result = await mcp_client.run(
        "Using the internet, tell me how many people work at OpenAI",
        # manage_connector=True is the default
    )

    print("Result:")
    print(result)


if __name__ == "__main__":
    asyncio.run(main())
```
Advanced Usage
See the examples directory for more advanced usage examples:
- basic_usage.py: basic usage with different models
- simplified_usage.py: automatic connector lifecycle management
- websocket_example.py: connecting to a remote MCP over WebSocket
Requirements
- Python 3.8+
- MCP implementation (like Playwright MCP)
- LangChain and appropriate model libraries (OpenAI, Anthropic, etc.)
License
MIT