
✨ litemcp

A minimal, lightweight client designed to simplify SDK adoption into MCP.

litemcp enables rapid and intuitive integration of various AI SDKs (e.g., LangChain, Agent SDK) into your MCP projects, emphasizing simplicity, flexibility, and minimal dependencies.

🌟 Key Features

  • Simplicity: Streamlined interfaces ensure easy integration.
  • Flexibility: Quickly adopt diverse SDKs with minimal effort.
  • Lightweight: Designed with minimal dependencies to maximize clarity and performance.

🛠 Installation

Install via pip:

pip install litemcp

🚀 Quick Start

litemcp allows you to integrate tools from an MCP server into various LLM runtimes, including the OpenAI Agent SDK, LangChain, and direct OpenAI API calls.

Below are three examples showing how to use litemcp in different contexts:

✅ OpenAI Agent SDK Integration

import asyncio
import sys

from agents import Agent, Runner  # OpenAI Agent SDK
from litemcp import MCPServerManager  # assumed import path

async def main():
    async with MCPServerManager(sys.argv[1]) as server_manager:
        mcp_server_tools = await server_manager.agent_sdk_tools()

        agent = Agent(
            name="assistant",
            instructions="You are an AI assistant.",
            tools=mcp_server_tools,
        )

        result = await Runner.run(agent, "List all the kubernetes clusters")
        print(result.final_output)

if __name__ == "__main__":
    asyncio.run(main())

✅ LangChain Integration

import asyncio
from typing import List

from langchain_core.messages import HumanMessage, SystemMessage
from langchain_core.tools import BaseTool
from langchain_openai import ChatOpenAI

from litemcp import MCPServerManager  # assumed import path

async def main(config):
    chat = ChatOpenAI(model="gpt-3.5-turbo-0125")
    async with MCPServerManager(config) as server_manager:

        # bind tools
        tools: List[BaseTool] = await server_manager.langchain_tools()
        chat_with_tools = chat.bind_tools(tools, tool_choice="any")

        messages = [
            SystemMessage(content="You're a helpful assistant"),
            HumanMessage(content="List the dirs in the /Users"),
        ]
        tool_calls = chat_with_tools.invoke(messages).tool_calls

        # invoke the tool_call
        tool_map = {tool.name: tool for tool in tools}
        for tool_call in tool_calls:
            selected_tool = tool_map[tool_call["name"].lower()]
            tool_output = await selected_tool.ainvoke(tool_call["args"])
            print(tool_output)

✅ Direct OpenAI API Integration

import asyncio

from openai import OpenAI

from litemcp import MCPServerManager  # assumed import path

async def main(config):
    client = OpenAI()

    async with MCPServerManager(config) as server_manager:
        schemas = await server_manager.schemas()

        completion = client.chat.completions.create(
            model="gpt-4o",
            messages=[{"role": "user", "content": "List the dirs in the /Users"}],
            tools=schemas,
        )

        print(completion.choices[0].message.tool_calls)

        # Execute the selected tool
        tool_call = completion.choices[0].message.tool_calls[0]
        result = await server_manager.tool_call(
            tool_call.function.name, tool_call.function.arguments
        )
        print(result.content[0].text)
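The `schemas()` call must return tool definitions in the shape the OpenAI `tools` parameter expects. As an illustration (the tool name and parameters below are hypothetical, not taken from litemcp):

```python
# Hypothetical example of one entry in the list passed as `tools=schemas`.
# This mirrors the OpenAI function-calling tool format; in practice the
# names and parameters come from the tools your MCP server exposes.
example_schema = {
    "type": "function",
    "function": {
        "name": "list_dirs",  # hypothetical tool name
        "description": "List directories under a given path.",
        "parameters": {  # JSON Schema describing the tool's arguments
            "type": "object",
            "properties": {"path": {"type": "string"}},
            "required": ["path"],
        },
    },
}
```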

🔐 Tool Call Validator (Optional)

You can add a custom validation function to control MCP tool calls. This helps prevent server tools from accessing your system without permission, for example by adding a human-in-the-loop confirmation step.

1. Define the Validator

import sys
from typing import Optional

from rich.console import Console

console = Console()

def applier_validator(func_args) -> Optional[str]:
    """
    Return:
    - None: allow the tool call
    - str : block the tool call and return this message instead
    """
    # NOTE: assumes the tool's arguments include a "cluster" field
    cluster = func_args.get("cluster", "unknown")
    user_input = console.input(
        f"  🛠  Cluster - [yellow]{cluster}[/yellow] ⎈ Proceed with this YAML? (yes/no): "
    ).strip().lower()

    if user_input in {"yes", "y"}:
        return None
    if user_input in {"no", "n"}:
        console.print("[red]Exiting process.[/red]")
        sys.exit(0)
    return user_input

2. Register the Validator with the MCP Server

async with MCPServerManager(sys.argv[1]) as server_manager:
    server_manager.register_validator("yaml_applier", applier_validator)

    mcp_server_tools = await server_manager.agent_sdk_tools()

    engineer = Agent(...)
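Conceptually, registering a validator means the manager consults it before executing the named tool. A minimal sketch of that contract (this is an illustration of the semantics described above, not litemcp's actual implementation):

```python
from typing import Callable, Dict, Optional

validators: Dict[str, Callable[[dict], Optional[str]]] = {}

def register_validator(tool_name: str, fn) -> None:
    validators[tool_name] = fn

def run_tool(tool_name: str, func_args: dict, execute) -> str:
    # Consult the validator first: None allows the call,
    # a string blocks it and is returned in place of the result.
    verdict = validators.get(tool_name, lambda _: None)(func_args)
    if verdict is not None:
        return verdict
    return execute(func_args)

# Hypothetical validator that approves only calls flagged "ok".
register_validator("yaml_applier", lambda args: None if args.get("ok") else "blocked")
print(run_tool("yaml_applier", {"ok": True}, lambda a: "applied"))   # applied
print(run_tool("yaml_applier", {"ok": False}, lambda a: "applied"))  # blocked
```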

📖 MCP Configuration Schema

Configure your MCP environment with optional server enabling and tool exclusion:

{
  "mcpServers": {
    "fetch": {
      "command": "uvx",
      "args": ["mcp-server-fetch"]
    },
    "youtube": {
      "command": "npx",
      "args": ["-y", "github:anaisbetts/mcp-youtube"],
      "exclude_tools": ["..."]
    },
    "mcp-server-commands": {
      "command": "npx",
      "args": ["mcp-server-commands"],
      "requires_confirmation": [
        "run_command",
        "run_script"
      ],
      "enabled": false
    },
    "multicluster-mcp-server": {
      "command": "node",
      "args": [".../multicluster-mcp-server/build/index.js"],
      "enabled": false
    }
  }
}

  • Use "enabled": true/false to activate or deactivate a server.
  • Use "exclude_tools" to omit unnecessary tools from the current MCP server.
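The two options above can be sketched in plain Python. This is not litemcp's implementation, just an illustration of how a client might interpret `enabled` and `exclude_tools` (the tool name below is hypothetical):

```python
import json

config_text = """
{
  "mcpServers": {
    "fetch": {"command": "uvx", "args": ["mcp-server-fetch"]},
    "youtube": {
      "command": "npx",
      "args": ["-y", "github:anaisbetts/mcp-youtube"],
      "exclude_tools": ["download_video"]
    },
    "mcp-server-commands": {
      "command": "npx",
      "args": ["mcp-server-commands"],
      "enabled": false
    }
  }
}
"""
config = json.loads(config_text)

# Keep only servers whose "enabled" flag is absent or true.
active = {
    name: spec
    for name, spec in config["mcpServers"].items()
    if spec.get("enabled", True)
}
print(sorted(active))  # ['fetch', 'youtube']

# For each active server, "exclude_tools" names tools to drop.
excluded = {name: set(spec.get("exclude_tools", [])) for name, spec in active.items()}
print(excluded["youtube"])  # {'download_video'}
```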

📖 Documentation

Detailed documentation coming soon!

📢 Contributing

Contributions and suggestions are welcome! Please open an issue or submit a pull request.

📜 License

litemcp is available under the MIT License.
