
A Python library that wraps Anthropic's low-level MCP client library

Project description

MCP Wrapper for AgentSociety

A library that wraps low-level MCP client functionality, exposing it for easy usage as tools within a LangChain workflow. This enables seamless integration of remote MCP tools for language model workflows and agent applications.

Structure

Library Code: mcpwrap/

  • client.py — Basic utility to invoke a LangChain model with a set of tools.
  • server_stub.py — Async abstraction (ServerSession) for connecting to and interfacing with an MCP server, exposing MCP tools for LangChain.
  • multi_mcp_model.py — Orchestrates multiple MCP server sessions and integrates their toolsets for use by one BaseChatModel. Handles async invocation and batching of tool requests.
  • llm_integration.py — Converts MCP tool schemas (JSON Schema) into dynamic, structured LangChain tools using Pydantic models.
  • (You may need to install mcp and LangChain dependencies.)

Example App: sample/

  • Provides a minimal, integration-tested example of how to use this library in practice. (See directory for details.)

Usage Overview

1. Connect MCP servers as tool providers

from mcpwrap.server_stub import ServerSession

session = ServerSession(name="example", url="http://localhost:8080")
await session.initialize()  # must be awaited inside a running event loop

2. Compose Multi-server Tools for LangChain

from langchain_core.language_models.chat_models import BaseChatModel
from mcpwrap.multi_mcp_model import MultiMcpModel

base_model = ... # Any supported BaseChatModel
mcp_model = MultiMcpModel(
    base_model=base_model,
    mcp_servers=[session]  # Add as many as you like
)
await mcp_model.initialize()

3. Use in a Chat Workflow

results = await mcp_model.ainvoke(messages)  # messages: List[BaseMessage]
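Since every entry point is a coroutine, the whole workflow has to run inside an event loop. A minimal, dependency-free sketch of that calling pattern, where DummyModel is a hypothetical stand-in for the MultiMcpModel used above:

```python
import asyncio

class DummyModel:
    """Hypothetical stand-in for MultiMcpModel, showing only the calling pattern."""

    async def initialize(self):
        # The real wrapper would open its MCP server sessions here.
        self.ready = True

    async def ainvoke(self, messages):
        # The real wrapper would forward messages to the base chat model
        # and dispatch any resulting tool calls to the MCP servers.
        return f"echo: {messages[-1]}"

async def main():
    model = DummyModel()
    await model.initialize()
    return await model.ainvoke(["Hello, tools!"])

result = asyncio.run(main())
print(result)  # echo: Hello, tools!
```

In a script you would wrap steps 1–3 in a single `async def main()` and hand it to `asyncio.run()` exactly like this.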

Features

  • Wraps multiple MCP servers: Maps their tools to a common namespace for use as LangChain tools.
  • Schema Conversion: Converts arbitrary MCP tool JSON schemas into Pydantic models for argument validation.
  • Async, batched operations: Tools are called concurrently when invoked by an LLM agent.
  • Error and retry handling: Optionally retries failed server/tool executions.
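The last two features, concurrent dispatch and bounded retries, follow a standard asyncio pattern. A rough, library-independent sketch (call_tool, call_with_retry, and the retry count are illustrative names, not the wrapper's actual API):

```python
import asyncio

async def call_tool(name: str, args: dict) -> str:
    # Stand-in for a remote MCP tool call; the real call goes over the network.
    await asyncio.sleep(0)
    return f"{name} ok"

async def call_with_retry(name: str, args: dict, retries: int = 3) -> str:
    # Retry a failed tool execution a bounded number of times.
    for attempt in range(retries):
        try:
            return await call_tool(name, args)
        except Exception:
            if attempt == retries - 1:
                raise
            await asyncio.sleep(2 ** attempt)  # simple exponential backoff

async def run_batch(requests):
    # Dispatch all tool calls concurrently, as when an LLM agent
    # emits several tool calls in a single turn.
    return await asyncio.gather(*(call_with_retry(n, a) for n, a in requests))

results = asyncio.run(run_batch([("add", {"a": 1}), ("search", {"q": "x"})]))
print(results)  # ['add ok', 'search ok']
```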

Requirements

  • pip install -r requirements.txt

Notes

  • Ensure MCP servers are reachable at their configured URLs.
  • Schema support currently assumes JSON Schema Draft 7 compatibility for tool arguments.
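For illustration, here is a dependency-free sketch of checking tool arguments against a Draft-7-style schema. The library itself builds Pydantic models for this (see llm_integration.py); this only shows the kind of validation involved, and validate_args is a hypothetical helper:

```python
# Map JSON Schema primitive type names to Python types (subset only).
TYPE_MAP = {"string": str, "integer": int, "number": (int, float), "boolean": bool}

def validate_args(schema: dict, args: dict) -> None:
    # Enforce required keys.
    for key in schema.get("required", []):
        if key not in args:
            raise ValueError(f"missing required argument: {key}")
    # Enforce primitive types on known properties.
    for key, spec in schema.get("properties", {}).items():
        if key in args and not isinstance(args[key], TYPE_MAP[spec["type"]]):
            raise TypeError(f"{key} must be of type {spec['type']}")

schema = {
    "type": "object",
    "properties": {"query": {"type": "string"}, "limit": {"type": "integer"}},
    "required": ["query"],
}
validate_args(schema, {"query": "mcp", "limit": 5})  # passes silently
```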

Contributing

PRs, issues, and questions are welcome!



Download files

Download the file for your platform.

Source Distribution

agent_society_mcp_client_wrapper-0.0.1.tar.gz (6.2 kB)

Built Distribution

agent_society_mcp_client_wrapper-0.0.1-py3-none-any.whl

File details

Hashes for agent_society_mcp_client_wrapper-0.0.1.tar.gz:

  • SHA256: c64bf8e76a34d33f9f28b4c3e3ab45a3bb20ce80b167e8a9f9e37c39dd43bcab
  • MD5: 46c6c87dae4374236fcef0651931b0b8
  • BLAKE2b-256: 6912784ea7cb9d2f3d20087ee1009dddfc0d9728a2faa61fa597ae43f20bcbe2

Hashes for agent_society_mcp_client_wrapper-0.0.1-py3-none-any.whl:

  • SHA256: 44d174e54b6c1a457c59884970719451f2481173b76ebb77f5e06558e56bed36
  • MD5: bc8e9923a92c29c523e8fcc0ac712d4b
  • BLAKE2b-256: aa7085328ba4de37850fee3d41fab9c6e302bf5065b64cb04bd1b03c8517d92a
