
A Python library that wraps Anthropic's low-level MCP client


MCP Wrapper for AgentSociety

A library that wraps low-level MCP client functionality and exposes it as tools for use in LangChain workflows. This enables seamless integration of remote MCP tools into language-model and agent applications.

Structure

Library Code: mcpwrap/

  • client.py — Basic utility to invoke a LangChain model with a set of tools.
  • server_stub.py — Async abstraction (ServerSession) for connecting to and interfacing with an MCP server, exposing MCP tools for LangChain.
  • multi_mcp_model.py — Orchestrates multiple MCP server sessions and integrates their toolsets for use by one BaseChatModel. Handles async invocation and batching of tool requests.
  • llm_integration.py — Converts MCP tool schemas (JSON Schema) into dynamic, structured LangChain tools using Pydantic models.
  • (You may need to install mcp and LangChain dependencies.)
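To make the schema-conversion step concrete, here is an illustrative sketch (not the library's actual code) of how a flat MCP tool JSON Schema can be turned into a dynamic Pydantic model, as llm_integration.py does. The weather-lookup schema and the model_from_schema helper are hypothetical examples:

```python
from pydantic import create_model

# Hypothetical MCP tool schema for a weather-lookup tool.
schema = {
    "type": "object",
    "properties": {
        "city": {"type": "string"},
        "units": {"type": "string", "default": "metric"},
    },
    "required": ["city"],
}

# Map JSON Schema primitive types to Python types.
TYPE_MAP = {"string": str, "number": float, "integer": int, "boolean": bool}

def model_from_schema(name, schema):
    """Build a Pydantic model from a flat JSON Schema object."""
    fields = {}
    for prop, spec in schema.get("properties", {}).items():
        py_type = TYPE_MAP.get(spec.get("type"), str)
        if prop in schema.get("required", []):
            fields[prop] = (py_type, ...)  # required field
        else:
            fields[prop] = (py_type, spec.get("default", None))
    return create_model(name, **fields)

WeatherArgs = model_from_schema("WeatherArgs", schema)
args = WeatherArgs(city="Berlin")  # validates arguments, applies defaults
```

A model built this way can be attached to a LangChain structured tool as its argument schema, so the LLM's tool-call arguments are validated before the MCP server is invoked.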

Example App: sample/

  • Provides a minimal, integration-tested example of how to use this library in practice. (See directory for details.)

Usage Overview

1. Connect MCP servers as tool providers

from mcpwrap.server_stub import ServerSession

session = ServerSession(name="example", url="http://localhost:8080")
await session.initialize()  # must be awaited inside an async function / running event loop

2. Compose Multi-server Tools for LangChain

from langchain_core.language_models.chat_models import BaseChatModel
from mcpwrap.multi_mcp_model import MultiMcpModel

base_model = ... # Any supported BaseChatModel
mcp_model = MultiMcpModel(
    base_model=base_model,
    mcp_servers=[session]  # Add as many as you like
)
await mcp_model.initialize()

3. Use in a Chat Workflow

results = await mcp_model.ainvoke(messages)  # messages: List[BaseMessage]

Features

  • Wraps multiple MCP servers: Maps their tools to a common namespace for use as LangChain tools.
  • Schema Conversion: Converts arbitrary MCP tool JSON schemas into Pydantic models for argument validation.
  • Async, batched operations: Tools are called concurrently when invoked by an LLM agent.
  • Error and retry handling: Optionally retries failed server/tool executions.
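The concurrency and retry behavior can be sketched with plain asyncio; this is an illustrative stand-in, not the library's internals, and call_tool/dispatch are hypothetical names:

```python
import asyncio

async def call_tool(name, args, attempts=2):
    """Invoke a tool, retrying once on failure (stand-in for a real
    networked MCP call)."""
    for attempt in range(attempts):
        try:
            await asyncio.sleep(0.01)  # simulate network latency
            return {"tool": name, "args": args}
        except Exception:
            if attempt == attempts - 1:
                raise  # out of retries; surface the error

async def dispatch(tool_calls):
    # Fan out all requested tool calls concurrently; results come back
    # in the same order the calls were requested.
    return await asyncio.gather(*(call_tool(n, a) for n, a in tool_calls))

results = asyncio.run(dispatch([("search", {"q": "mcp"}), ("fetch", {"id": 1})]))
```

When an LLM agent requests several tool calls in one turn, awaiting them together like this keeps total latency close to the slowest single call rather than the sum of all calls.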

Requirements

  • pip install -r requirements.txt to install the mcp and LangChain dependencies.
  • Alternatively, install the published package: pip install agent_society_mcp_client_wrapper

Notes

  • Ensure MCP servers are reachable at their configured URLs.
  • Schema support currently assumes JSON Schema Draft 7 compatibility for tool arguments.

Contributing

PRs, issues, and questions are welcome!
