
A Python library that wraps Anthropic's low-level MCP client


MCP Wrapper for AgentSociety

A library that wraps low-level MCP client functionality and exposes it as tools for use within a LangChain workflow. This enables seamless integration of remote MCP tools into language-model and agent applications.

Structure

Library Code: mcpwrap/

  • client.py — Basic utility to invoke a LangChain model with a set of tools.
  • server_stub.py — Async abstraction (ServerSession) for connecting to and interfacing with an MCP server, exposing MCP tools for LangChain.
  • multi_mcp_model.py — Orchestrates multiple MCP server sessions and integrates their toolsets for use by one BaseChatModel. Handles async invocation and batching of tool requests.
  • llm_integration.py — Converts MCP tool schemas (JSON Schema) into dynamic, structured LangChain tools using Pydantic models.
  • (You may need to install mcp and LangChain dependencies.)
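To illustrate the idea behind llm_integration.py, here is a stdlib-only sketch (not the library's actual code, which builds full Pydantic models) of turning a JSON Schema tool definition into a validating Python callable. The `weather_schema` tool is a hypothetical example:

```python
# Map JSON Schema primitive types to Python types (simplified stand-in
# for the Pydantic models the library generates).
TYPE_MAP = {"string": str, "integer": int, "number": float, "boolean": bool}

def make_validator(schema: dict):
    """Return a function that validates kwargs against a JSON Schema object."""
    props = schema.get("properties", {})
    required = set(schema.get("required", []))

    def validate(**kwargs):
        # Reject calls that omit required arguments.
        for name in required:
            if name not in kwargs:
                raise ValueError(f"missing required argument: {name}")
        # Reject arguments whose type disagrees with the schema.
        for name, value in kwargs.items():
            expected = TYPE_MAP.get(props.get(name, {}).get("type"))
            if expected is not None and not isinstance(value, expected):
                raise TypeError(f"{name} must be {expected.__name__}")
        return kwargs

    return validate

# Hypothetical MCP tool schema for demonstration.
weather_schema = {
    "type": "object",
    "properties": {"city": {"type": "string"}, "days": {"type": "integer"}},
    "required": ["city"],
}
validate_weather = make_validator(weather_schema)
print(validate_weather(city="Berlin", days=3))  # {'city': 'Berlin', 'days': 3}
```

The real module does considerably more (nested objects, defaults, descriptions), but the shape is the same: schema in, structured argument validation out.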

Example App: sample/

  • Provides a minimal, integration-tested example of how to use this library in practice. (See directory for details.)

Usage Overview

1. Connect MCP servers as tool providers

from mcpwrap.server_stub import ServerSession

session = ServerSession(name="example", url="http://localhost:8080")
await session.initialize()  # must be awaited inside an async function (e.g. run via asyncio.run)

2. Compose Multi-server Tools for LangChain

from langchain_core.language_models.chat_models import BaseChatModel
from mcpwrap.multi_mcp_model import MultiMcpModel

base_model = ... # Any supported BaseChatModel
mcp_model = MultiMcpModel(
    base_model=base_model,
    mcp_servers=[session]  # Add as many as you like
)
await mcp_model.initialize()

3. Use in a Chat Workflow

results = await mcp_model.ainvoke(messages)  # messages: List[BaseMessage]

Features

  • Wraps multiple MCP servers: Maps their tools to a common namespace for use as LangChain tools.
  • Schema Conversion: Converts arbitrary MCP tool JSON schemas into Pydantic models for argument validation.
  • Async, batched operations: Tools are called concurrently when invoked by an LLM agent.
  • Error and retry handling: Optionally retries failed server/tool executions.
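The batched-call pattern behind the concurrency feature can be sketched with the standard library alone. This is an illustrative stand-in, not the library's implementation; `call_tool` simulates a remote MCP tool invocation with hypothetical tool names:

```python
import asyncio

async def call_tool(name: str, args: dict) -> dict:
    # Stand-in for a remote MCP tool invocation.
    await asyncio.sleep(0.01)  # simulated network latency
    return {"tool": name, "result": args}

async def run_batch(tool_calls: list[tuple[str, dict]]) -> list[dict]:
    # Fire every tool call the LLM requested at once and await them together,
    # instead of executing them one at a time.
    return await asyncio.gather(*(call_tool(n, a) for n, a in tool_calls))

results = asyncio.run(
    run_batch([("search", {"q": "mcp"}), ("fetch", {"id": 7})])
)
print(results[0]["tool"])  # search
```

`asyncio.gather` preserves input order in its results, so responses can be matched back to the original tool-call requests by position.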

Requirements

  • pip install -r requirements.txt

Notes

  • Ensure MCP servers are reachable at their configured URLs.
  • Schema support currently assumes JSON Schema Draft 7 compatibility for tool arguments.
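For reference, a minimal Draft 7-compatible argument schema of the kind the wrapper consumes might look like the following (the `get_forecast`-style fields here are a hypothetical example, not part of the library):

```json
{
  "type": "object",
  "properties": {
    "city": { "type": "string", "description": "City name" },
    "units": { "type": "string", "enum": ["metric", "imperial"] }
  },
  "required": ["city"]
}
```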

Contributing

PRs, issues, and questions are welcome!
