Model Context Protocol (MCP) To LangChain Tools Conversion Utility

Project description

License: MIT

This package is intended to simplify the use of Model Context Protocol (MCP) server tools with LangChain / Python.

Model Context Protocol (MCP), an open-source technology announced by Anthropic, dramatically expands an LLM's scope by enabling integration with external tools and resources, including Google Drive, Slack, Notion, Spotify, Docker, PostgreSQL, and more.

Over 2000 functional components are available as MCP servers.

The goal of this utility is to make these 2000+ MCP servers readily accessible from LangChain.

It contains a utility function convert_mcp_to_langchain_tools().
This async function handles parallel initialization of multiple specified MCP servers and converts their available tools into a list of LangChain-compatible tools.

For detailed information on how to use this library, please refer to the article "Supercharging LangChain: Integrating 2000+ MCP with ReAct".

A TypeScript equivalent of this utility is available here

Prerequisites

  • Python 3.11+

Installation

pip install langchain-mcp-tools

Quick Start

The convert_mcp_to_langchain_tools() utility function accepts MCP server configurations expressed as a dict. The dict follows the same structure as Claude for Desktop's configuration file, but contains only the contents of its mcpServers property, e.g.:

mcp_servers = {
    "filesystem": {
        "command": "npx",
        "args": ["-y", "@modelcontextprotocol/server-filesystem", "."]
    },
    "fetch": {
        "command": "uvx",
        "args": ["mcp-server-fetch"]
    }
}

# from langchain_mcp_tools import convert_mcp_to_langchain_tools
tools, cleanup = await convert_mcp_to_langchain_tools(
    mcp_servers
)

This utility function initializes all specified MCP servers in parallel, gathers the tools available on each server, and wraps them into LangChain tools, returning them as tools: list[BaseTool]. It also returns an async callback function (cleanup: McpServerCleanupFn) that should be invoked to close all MCP server sessions when finished.
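As a minimal sketch of the intended lifecycle, the cleanup callback is best awaited in a finally block so the MCP sessions close even if the agent raises. A stand-in coroutine is used below in place of the real McpServerCleanupFn, since spawning actual MCP servers is outside this snippet's scope:

```python
import asyncio

# Records whether the (stand-in) cleanup callback ran
closed = []

async def cleanup() -> None:
    """Stand-in for the McpServerCleanupFn returned by the utility."""
    closed.append(True)

async def main() -> None:
    try:
        # ... use the LangChain tools / run the agent here ...
        pass
    finally:
        # Always invoked, even if tool use above raised an exception
        await cleanup()

asyncio.run(main())
```

With the real callback, the same try/finally pattern ensures every MCP server subprocess or connection is shut down when the agent is done.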

The returned tools can be used with LangChain, e.g.:

# from langchain.chat_models import init_chat_model
llm = init_chat_model(
    model='claude-3-7-sonnet-latest',
    model_provider='anthropic'
)

# from langgraph.prebuilt import create_react_agent
agent = create_react_agent(
    llm,
    tools
)

Find complete, minimal working usage examples here

For hands-on experimentation with MCP server integration, try this LangChain application built with the utility

For detailed information on how to use this library, please refer to the following document:
"Supercharging LangChain: Integrating 2000+ MCP with ReAct"

Experimental Features

Remote MCP Server Support

The mcp_servers configuration for SSE and WebSocket servers is as follows:

    "sse-server-name": {
        "url": f"http://{sse_server_host}:{sse_server_port}/..."
    },

    "ws-server-name": {
        "url": f"ws://{ws_server_host}:{ws_server_port}/..."
    },

Note that the key name "url" may be changed in the future to match the MCP server configurations used by Claude for Desktop once it introduces remote server support.

Working Directory Configuration for Local MCP Servers

The working directory used when spawning a local MCP server can be specified with the cwd key as follows:

    "local-server-name": {
        "command": "...",
        "args": [...],
        "cwd": "/working/directory"  # the working dir to be used by the server
    },

Configuration for MCP Server stderr Redirection

A new key, errlog, has been introduced to specify a file-like object to which an MCP server's stderr is redirected:

    log_path = f"mcp-server-{server_name}.log"
    log_file = open(log_path, "w")
    mcp_servers[server_name]["errlog"] = log_file

NOTE: Why the key name errlog was chosen for server_config:
Unlike the TypeScript SDK's StdioServerParameters, the Python SDK's StdioServerParameters doesn't include stderr: int. Instead, it passes a separate errlog: TextIO argument to stdio_client(). I once included stderr: int for compatibility with the TypeScript version, but decided to follow the Python SDK more closely.
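As an illustrative sketch, one log file can be opened per configured server and attached via errlog. The server entries and log location below are placeholders:

```python
import os
import tempfile

# Hypothetical configuration; the command is a placeholder for a real MCP server
mcp_servers = {
    "fetch": {"command": "uvx", "args": ["mcp-server-fetch"]},
}

log_files = []
for server_name, server_config in mcp_servers.items():
    # Each server's stderr will be redirected to its own log file
    log_path = os.path.join(
        tempfile.gettempdir(), f"mcp-server-{server_name}.log"
    )
    log_file = open(log_path, "w")
    server_config["errlog"] = log_file
    log_files.append(log_file)  # keep the handles to close them later
```

The opened file objects should be closed after awaiting cleanup(), once all server sessions have shut down.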

Limitations

  • Currently, only text results of tool calls are supported.
  • MCP features other than Tools are not supported.

Change Log

Can be found here

