
Model Context Protocol (MCP) To LangChain Tools Conversion Utility

Project description


This package is intended to simplify the use of Model Context Protocol (MCP) server tools with LangChain / Python.

Model Context Protocol (MCP), an open standard announced by Anthropic, dramatically expands an LLM's scope by enabling external tool and resource integration, including GitHub, Google Drive, Slack, Notion, Spotify, Docker, PostgreSQL, and more.

MCP is likely to become the de facto industry standard as OpenAI has announced its adoption.

Over 2000 functional components are available as MCP servers.

The goal of this utility is to make these 2000+ MCP servers readily accessible from LangChain.

It contains a utility function convert_mcp_to_langchain_tools().
This async function handles parallel initialization of specified multiple MCP servers and converts their available tools into a list of LangChain-compatible tools.

For detailed information on how to use this library, please refer to the following document:
"Supercharging LangChain: Integrating 2000+ MCP with ReAct"

A TypeScript equivalent of this utility is available here.

Prerequisites

  • Python 3.11+

Installation

pip install langchain-mcp-tools

API docs

Can be found here

Quick Start

A minimal but complete working usage example can be found in this example in the langchain-mcp-tools-py-usage repo.

The convert_mcp_to_langchain_tools() utility function accepts MCP server configurations that follow the same structure as Claude for Desktop's configuration file, but containing only the contents of its mcpServers property, expressed as a dict, e.g.:

mcp_servers = {
    "filesystem": {
        "command": "npx",
        "args": ["-y", "@modelcontextprotocol/server-filesystem", "."]
    },
    "fetch": {
        "command": "uvx",
        "args": ["mcp-server-fetch"]
    }
}

tools, cleanup = await convert_mcp_to_langchain_tools(
    mcp_servers
)

This utility function initializes all specified MCP servers in parallel, gathers the MCP tools they expose, and wraps each one as a LangChain tool, returning them as tools: list[BaseTool]. It also returns an async callback function (cleanup: McpServerCleanupFn) that should be invoked to close all MCP server sessions when finished.
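Conceptually, the parallel initialization and combined cleanup callback can be sketched with plain asyncio. This is a simplified stand-in for illustration only, not the library's actual implementation; the names init_server and convert_all are hypothetical:

```python
import asyncio


async def init_server(name: str) -> dict:
    # Stand-in for spawning an MCP server session and listing its tools.
    await asyncio.sleep(0)  # simulate async I/O
    return {"name": name, "tools": [f"{name}_tool"]}


async def convert_all(server_names: list[str]):
    # Initialize every server concurrently; gather() preserves order.
    sessions = await asyncio.gather(*(init_server(n) for n in server_names))
    tools = [t for s in sessions for t in s["tools"]]

    async def cleanup() -> None:
        # A real implementation would close each MCP client session here.
        sessions.clear()

    return tools, cleanup


tools, cleanup = asyncio.run(convert_all(["filesystem", "fetch"]))
```

Because initialization runs concurrently, total startup time is roughly that of the slowest server rather than the sum of all of them.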

The returned tools can be used with LangChain, e.g.:

from langchain.chat_models import init_chat_model
from langgraph.prebuilt import create_react_agent

llm = init_chat_model(
    model="claude-3-7-sonnet-latest",
    model_provider="anthropic"
)

agent = create_react_agent(
    llm,
    tools
)

For hands-on experimentation with MCP server integration, try this LangChain application built with the utility.

For detailed information on how to use this library, please refer to the following document:
"Supercharging LangChain: Integrating 2000+ MCP with ReAct"

Experimental Features

Remote MCP Server Support

The mcp_servers configurations for SSE and WebSocket servers are as follows:

    "sse-server-name": {
        "url": f"http://{sse_server_host}:{sse_server_port}/..."
    },

    "ws-server-name": {
        "url": f"ws://{ws_server_host}:{ws_server_port}/..."
    },

Note that the key "url" may be changed in the future to match the MCP server configurations used by Claude for Desktop once it introduces remote server support.

A usage example can be found here

Authentication Support for SSE Connections

A new key "headers" has been introduced to pass HTTP headers to the SSE (Server-Sent Events) connection.
It takes dict[str, str] and is primarily intended to support SSE MCP servers that require authentication via bearer tokens or other custom headers.

    "sse-server-name": {
        "url": f"http://{sse_server_host}:{sse_server_port}/...",
        "headers": {"Authorization": f"Bearer {bearer_token}"}
    },

The key name headers is derived from the Python SDK sse_client() argument name.
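For example, a complete mcp_servers entry carrying a bearer token might look like the following. The host, port, path, and token values are placeholders for illustration only:

```python
# Placeholder values; substitute your server's actual address and token.
sse_server_host = "localhost"
sse_server_port = 8000
bearer_token = "example-token"

mcp_servers = {
    "sse-server-name": {
        "url": f"http://{sse_server_host}:{sse_server_port}/sse",
        "headers": {"Authorization": f"Bearer {bearer_token}"},
    },
}
```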

A simple example showing how to implement an MCP SSE server and client with authentication can be found in sse-auth-test-client.py and sse-auth-test-server.py of this usage examples repo.

Working Directory Configuration for Local MCP Servers

The working directory that is used when spawning a local (stdio) MCP server can be specified with the "cwd" key as follows:

    "local-server-name": {
        "command": "...",
        "args": [...],
        "cwd": "/working/directory"  # the working dir to be used by the server
    },

The key name cwd is derived from the Python SDK's StdioServerParameters.

stderr Redirection for Local MCP Server

A new key "errlog" has been introduced to specify a file-like object to which a local (stdio) MCP server's stderr is redirected.

    log_path = f"mcp-server-{server_name}.log"
    log_file = open(log_path, "w")
    mcp_servers[server_name]["errlog"] = log_file

A usage example can be found here
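Putting the snippet above together, one way to give each local server its own log file might look like the following sketch (it only builds the configuration; the log directory here is a temporary one for illustration, and the files should be closed after the cleanup callback has been awaited):

```python
import os
import tempfile

mcp_servers = {
    "filesystem": {
        "command": "npx",
        "args": ["-y", "@modelcontextprotocol/server-filesystem", "."],
    },
}

log_dir = tempfile.mkdtemp()  # use a real log directory in practice
log_files = []
for server_name, config in mcp_servers.items():
    log_path = os.path.join(log_dir, f"mcp-server-{server_name}.log")
    log_file = open(log_path, "w")
    config["errlog"] = log_file  # this server's stderr goes to the file
    log_files.append(log_file)

# ... call convert_mcp_to_langchain_tools(mcp_servers), run the agent,
# and after `await cleanup()` close the log files:
for f in log_files:
    f.close()
```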

NOTE: Why the key name errlog was chosen:
Unlike the TypeScript SDK's StdioServerParameters, the Python SDK's StdioServerParameters doesn't include stderr: int.
Instead, the SDK calls stdio_client() with a separate argument, errlog: TextIO.
I once included stderr: int for compatibility with the TypeScript version, but decided to follow the Python SDK more closely.

Limitations

  • Currently, only text results of tool calls are supported.
  • MCP features other than Tools are not supported.

Change Log

Can be found here
