
Run Model Context Protocol (MCP) servers with AWS Lambda


This project enables you to run Model Context Protocol stdio-based servers in AWS Lambda functions.

Currently, most implementations of MCP servers and clients are entirely local on a single machine. A desktop application such as an IDE or Claude Desktop initiates MCP servers locally as child processes and communicates with each of those servers over a long-running stdio stream.

flowchart LR
    subgraph "Your Laptop"
        Host["Desktop Application<br>with MCP Clients"]
        S1["MCP Server A<br>(child process)"]
        S2["MCP Server B<br>(child process)"]
        Host <-->|"MCP Protocol<br>(over stdio stream)"| S1
        Host <-->|"MCP Protocol<br>(over stdio stream)"| S2
    end

This library helps you wrap existing stdio MCP servers into Lambda functions. You can invoke these function-based MCP servers from your application using the MCP protocol over short-lived HTTPS connections. Your application can be a desktop app, a distributed system running in the cloud, or any other architecture.

flowchart LR
    subgraph "Distributed System"
        App["Your Application<br>with MCP Clients"]
        S3["MCP Server A<br>(Lambda function)"]
        S4["MCP Server B<br>(Lambda function)"]
        App <-->|"MCP Protocol<br>(over HTTPS connection)"| S3
        App <-->|"MCP Protocol<br>(over HTTPS connection)"| S4
    end

Using this library, the Lambda function will manage the lifecycle of your stdio MCP server. Each Lambda function invocation will:

  1. Start the stdio MCP server as a child process
  2. Initialize the MCP server
  3. Forward the incoming request to the local server
  4. Return the server's response to the function caller
  5. Shut down the MCP server child process
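The per-invocation lifecycle above can be sketched independently of this library: spawn a stdio child process, forward one JSON message over its stdin, read the reply from its stdout, and shut the process down. The child here is a hypothetical stand-in that echoes each JSON line back, not a real MCP server:

```python
import json
import subprocess
import sys

# Hypothetical stand-in for a stdio MCP server: echoes each line back.
CHILD = (
    "import sys\n"
    "for line in sys.stdin:\n"
    "    sys.stdout.write(line)\n"
    "    sys.stdout.flush()\n"
)


def handle_one_request(request: dict) -> dict:
    # 1. Start the stdio "server" as a child process
    proc = subprocess.Popen(
        [sys.executable, "-c", CHILD],
        stdin=subprocess.PIPE,
        stdout=subprocess.PIPE,
        text=True,
    )
    try:
        # 2./3. Forward the incoming request to the local server over stdio
        proc.stdin.write(json.dumps(request) + "\n")
        proc.stdin.flush()
        # 4. Return the server's response to the caller
        return json.loads(proc.stdout.readline())
    finally:
        # 5. Shut down the child process
        proc.stdin.close()
        proc.wait()


print(handle_one_request({"jsonrpc": "2.0", "id": 1, "method": "ping"}))
```

The library's adapters do the same forwarding, plus MCP initialization and error handling, inside each Lambda invocation.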

This library supports connecting to Lambda-based MCP servers in four ways:

  1. The MCP Streamable HTTP transport, using Amazon API Gateway. Typically authenticated using OAuth.
  2. The MCP Streamable HTTP transport, using Amazon Bedrock AgentCore Gateway (currently in Preview). Authenticated using OAuth.
  3. A custom Streamable HTTP transport with support for SigV4, using a Lambda function URL. Authenticated with AWS IAM.
  4. A custom Lambda invocation transport, using the Lambda Invoke API directly. Authenticated with AWS IAM.

Use API Gateway

flowchart LR
    App["MCP Client"]
    T1["MCP Server<br>(Lambda function)"]
    T2["API Gateway"]
    T3["OAuth Server<br>(Cognito or similar)"]
    App -->|"MCP Streamable<br>HTTP Transport"| T2
    T2 -->|"Invoke"| T1
    T2 -->|"Authorize"| T3

This solution is compatible with most MCP clients that support the streamable HTTP transport. MCP servers deployed with this architecture can typically be used with off-the-shelf MCP-compatible applications such as Cursor, Cline, Claude Desktop, etc.

You can choose your desired OAuth server provider for this solution. The examples in this repository use Amazon Cognito, or you can use third-party providers such as Okta or Auth0 with API Gateway custom authorization.

Python server example
import sys
from mcp.client.stdio import StdioServerParameters
from mcp_lambda import APIGatewayProxyEventHandler, StdioServerAdapterRequestHandler

server_params = StdioServerParameters(
    command=sys.executable,
    args=[
        "-m",
        "my_mcp_server_python_module",
        "--my-server-command-line-parameter",
        "some_value",
    ],
)


request_handler = StdioServerAdapterRequestHandler(server_params)
event_handler = APIGatewayProxyEventHandler(request_handler)


def handler(event, context):
    return event_handler.handle(event, context)

See a full, deployable example here.

Typescript server example
import {
  Handler,
  Context,
  APIGatewayProxyWithCognitoAuthorizerEvent,
  APIGatewayProxyResult,
} from "aws-lambda";
import {
  APIGatewayProxyEventHandler,
  StdioServerAdapterRequestHandler,
} from "@aws/run-mcp-servers-with-aws-lambda";

const serverParams = {
  command: "npx",
  args: [
    "--offline",
    "my-mcp-server-typescript-module",
    "--my-server-command-line-parameter",
    "some_value",
  ],
};

const requestHandler = new APIGatewayProxyEventHandler(
  new StdioServerAdapterRequestHandler(serverParams)
);

export const handler: Handler = async (
  event: APIGatewayProxyWithCognitoAuthorizerEvent,
  context: Context
): Promise<APIGatewayProxyResult> => {
  return requestHandler.handle(event, context);
};

See a full, deployable example here.

Python client example
from mcp import ClientSession
from mcp.client.streamable_http import streamablehttp_client

# Create OAuth client provider here

async with streamablehttp_client(
    url="https://abc123.execute-api.us-west-2.amazonaws.com/prod/mcp",
    auth=oauth_client_provider,
) as (
    read_stream,
    write_stream,
    _,
):
    async with ClientSession(read_stream, write_stream) as session:
        await session.initialize()
        tool_result = await session.call_tool("echo", {"message": "hello"})

See a full example as part of the sample chatbot here.

Typescript client example
import { StreamableHTTPClientTransport } from "@modelcontextprotocol/sdk/client/streamableHttp.js";
import { Client } from "@modelcontextprotocol/sdk/client/index.js";

const client = new Client(
  {
    name: "my-client",
    version: "0.0.1",
  },
  {
    capabilities: {
      sampling: {},
    },
  }
);

// Create OAuth client provider here

const transport = new StreamableHTTPClientTransport(
  "https://abc123.execute-api.us-west-2.amazonaws.com/prod/mcp",
  {
    authProvider: oauthProvider,
  }
);
await client.connect(transport);

See a full example as part of the sample chatbot here.

Use Bedrock AgentCore Gateway

flowchart LR
    App["MCP Client"]
    T1["MCP Server<br>(Lambda function)"]
    T2["Bedrock AgentCore Gateway"]
    T3["OAuth Server<br>(Cognito or similar)"]
    App -->|"MCP Streamable<br>HTTP Transport"| T2
    T2 -->|"Invoke"| T1
    T2 -->|"Authorize"| T3

This solution is compatible with most MCP clients that support the streamable HTTP transport. MCP servers deployed with this architecture can typically be used with off-the-shelf MCP-compatible applications such as Cursor, Cline, Claude Desktop, etc.

You can choose your desired OAuth server provider with Bedrock AgentCore Gateway, such as Amazon Cognito, Okta, or Auth0.

Using Bedrock AgentCore Gateway in front of your stdio-based MCP server requires that you retrieve the MCP server's tool schema, and provide it in the AgentCore Gateway Lambda target configuration. AgentCore Gateway can then advertise the schema to HTTP clients and validate request inputs and outputs.

To retrieve and save your stdio-based MCP server's tool schema to a file, run:

npx @modelcontextprotocol/inspector --cli --method tools/list <your MCP server command and arguments> > tool-schema.json

# For example:
npx @modelcontextprotocol/inspector --cli --method tools/list uvx mcp-server-time > tool-schema.json
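The saved file holds the server's tools/list result. Its exact contents depend on the server, but the shape generally follows the MCP tool schema; the tool name and input schema below are illustrative:

```json
{
  "tools": [
    {
      "name": "get_current_time",
      "description": "Get the current time in a given IANA timezone",
      "inputSchema": {
        "type": "object",
        "properties": {
          "timezone": { "type": "string" }
        },
        "required": ["timezone"]
      }
    }
  ]
}
```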
Python server example
import sys
from mcp.client.stdio import StdioServerParameters
from mcp_lambda import BedrockAgentCoreGatewayTargetHandler, StdioServerAdapterRequestHandler

server_params = StdioServerParameters(
    command=sys.executable,
    args=[
        "-m",
        "my_mcp_server_python_module",
        "--my-server-command-line-parameter",
        "some_value",
    ],
)


request_handler = StdioServerAdapterRequestHandler(server_params)
event_handler = BedrockAgentCoreGatewayTargetHandler(request_handler)


def handler(event, context):
    return event_handler.handle(event, context)

See a full, deployable example here.

Typescript server example
import { Handler, Context } from "aws-lambda";
import {
  BedrockAgentCoreGatewayTargetHandler,
  StdioServerAdapterRequestHandler,
} from "@aws/run-mcp-servers-with-aws-lambda";

const serverParams = {
  command: "npx",
  args: [
    "--offline",
    "my-mcp-server-typescript-module",
    "--my-server-command-line-parameter",
    "some_value",
  ],
};

const requestHandler = new BedrockAgentCoreGatewayTargetHandler(
  new StdioServerAdapterRequestHandler(serverParams)
);

export const handler: Handler = async (
  event: Record<string, unknown>,
  context: Context
): Promise<Record<string, unknown>> => {
  return requestHandler.handle(event, context);
};

See a full, deployable example here.

Python client example
from mcp import ClientSession
from mcp.client.streamable_http import streamablehttp_client

# Create OAuth client provider here

async with streamablehttp_client(
    url="https://abc123.gateway.bedrock-agentcore.us-west-2.amazonaws.com/mcp",
    auth=oauth_client_provider,
) as (
    read_stream,
    write_stream,
    _,
):
    async with ClientSession(read_stream, write_stream) as session:
        await session.initialize()
        tool_result = await session.call_tool("echo", {"message": "hello"})

See a full example as part of the sample chatbot here.

Typescript client example
import { StreamableHTTPClientTransport } from "@modelcontextprotocol/sdk/client/streamableHttp.js";
import { Client } from "@modelcontextprotocol/sdk/client/index.js";

const client = new Client(
  {
    name: "my-client",
    version: "0.0.1",
  },
  {
    capabilities: {
      sampling: {},
    },
  }
);

// Create OAuth client provider here

const transport = new StreamableHTTPClientTransport(
  "https://abc123.gateway.bedrock-agentcore.us-west-2.amazonaws.com/mcp",
  {
    authProvider: oauthProvider,
  }
);
await client.connect(transport);

See a full example as part of the sample chatbot here.

Use a Lambda function URL

flowchart LR
    App["MCP Client"]
    T1["MCP Server<br>(Lambda function)"]
    T2["Lambda function URL"]
    App -->|"Custom Streamable HTTP<br>Transport with AWS Auth"| T2
    T2 -->|"Invoke"| T1

This solution uses AWS IAM for authentication. It relies on granting the Lambda InvokeFunctionUrl permission to your IAM users and roles to enable access to the MCP server. Clients must use an extension of the MCP Streamable HTTP transport that signs requests with AWS SigV4. Off-the-shelf MCP-compatible applications are unlikely to support this custom transport, so this solution is better suited to service-to-service communication than to end users.
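For illustration, the SigV4 signing that such a transport performs can be sketched with the standard library alone. This is a simplified sketch, not this library's implementation: it signs only the host and x-amz-date headers and omits query-string canonicalization, session tokens, and other details that a real signer handles. The URL and keys are placeholders:

```python
import datetime
import hashlib
import hmac
import urllib.parse


def sign_request(method, url, body, access_key, secret_key, region, service="lambda"):
    """Produce minimal SigV4 headers for a request to a Lambda function URL (sketch)."""
    now = datetime.datetime.now(datetime.timezone.utc)
    amz_date = now.strftime("%Y%m%dT%H%M%SZ")
    date_stamp = now.strftime("%Y%m%d")
    parsed = urllib.parse.urlparse(url)

    # Canonical request: method, path, query, headers, signed headers, payload hash
    payload_hash = hashlib.sha256(body).hexdigest()
    canonical_headers = f"host:{parsed.netloc}\nx-amz-date:{amz_date}\n"
    signed_headers = "host;x-amz-date"
    canonical_request = "\n".join(
        [method, parsed.path or "/", "", canonical_headers, signed_headers, payload_hash]
    )

    # String to sign, scoped to date/region/service
    scope = f"{date_stamp}/{region}/{service}/aws4_request"
    string_to_sign = "\n".join(
        ["AWS4-HMAC-SHA256", amz_date, scope,
         hashlib.sha256(canonical_request.encode()).hexdigest()]
    )

    # Derive the signing key by chained HMACs, then sign
    key = f"AWS4{secret_key}".encode()
    for part in (date_stamp, region, service, "aws4_request"):
        key = hmac.new(key, part.encode(), hashlib.sha256).digest()
    signature = hmac.new(key, string_to_sign.encode(), hashlib.sha256).hexdigest()

    return {
        "X-Amz-Date": amz_date,
        "Authorization": (
            f"AWS4-HMAC-SHA256 Credential={access_key}/{scope}, "
            f"SignedHeaders={signed_headers}, Signature={signature}"
        ),
    }


headers = sign_request(
    "POST",
    "https://url-id-12345.lambda-url.us-west-2.on.aws/",
    b'{"jsonrpc":"2.0","id":1,"method":"ping"}',
    "AKIDEXAMPLE",
    "example-secret-key",
    "us-west-2",
)
```

In practice the `streamablehttp_client_with_sigv4` and `StreamableHTTPClientWithSigV4Transport` helpers shown below handle signing with your real AWS credentials.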

Python server example
import sys
from mcp.client.stdio import StdioServerParameters
from mcp_lambda import LambdaFunctionURLEventHandler, StdioServerAdapterRequestHandler

server_params = StdioServerParameters(
    command=sys.executable,
    args=[
        "-m",
        "my_mcp_server_python_module",
        "--my-server-command-line-parameter",
        "some_value",
    ],
)


request_handler = StdioServerAdapterRequestHandler(server_params)
event_handler = LambdaFunctionURLEventHandler(request_handler)


def handler(event, context):
    return event_handler.handle(event, context)

See a full, deployable example here.

Typescript server example
import {
  Handler,
  Context,
  APIGatewayProxyEventV2WithIAMAuthorizer,
  APIGatewayProxyResultV2,
} from "aws-lambda";
import {
  LambdaFunctionURLEventHandler,
  StdioServerAdapterRequestHandler,
} from "@aws/run-mcp-servers-with-aws-lambda";

const serverParams = {
  command: "npx",
  args: [
    "--offline",
    "my-mcp-server-typescript-module",
    "--my-server-command-line-parameter",
    "some_value",
  ],
};

const requestHandler = new LambdaFunctionURLEventHandler(
  new StdioServerAdapterRequestHandler(serverParams)
);

export const handler: Handler = async (
  event: APIGatewayProxyEventV2WithIAMAuthorizer,
  context: Context
): Promise<APIGatewayProxyResultV2> => {
  return requestHandler.handle(event, context);
};

See a full, deployable example here.

Python client example
from mcp import ClientSession
from mcp_lambda.client.streamable_http_sigv4 import streamablehttp_client_with_sigv4

async with streamablehttp_client_with_sigv4(
    url="https://url-id-12345.lambda-url.us-west-2.on.aws",
    service="lambda",
    region="us-west-2",
) as (
    read_stream,
    write_stream,
    _,
):
    async with ClientSession(read_stream, write_stream) as session:
        await session.initialize()
        tool_result = await session.call_tool("echo", {"message": "hello"})

See a full example as part of the sample chatbot here.

Typescript client example
import { StreamableHTTPClientWithSigV4Transport } from "@aws/run-mcp-servers-with-aws-lambda";
import { Client } from "@modelcontextprotocol/sdk/client/index.js";

const client = new Client(
  {
    name: "my-client",
    version: "0.0.1",
  },
  {
    capabilities: {
      sampling: {},
    },
  }
);

const transport = new StreamableHTTPClientWithSigV4Transport(
  new URL("https://url-id-12345.lambda-url.us-west-2.on.aws"),
  {
    service: "lambda",
    region: "us-west-2",
  }
);
await client.connect(transport);

See a full example as part of the sample chatbot here.

Use the Lambda Invoke API

flowchart LR
    App["MCP Client"]
    T1["MCP Server<br>(Lambda function)"]
    App -->|"Custom MCP Transport<br>(Lambda Invoke API)"| T1

Like the Lambda function URL approach, this solution uses AWS IAM for authentication. It relies on granting the Lambda InvokeFunction permission to your IAM users and roles to enable access to the MCP server. Clients must use a custom MCP transport that calls the Lambda Invoke API directly. Off-the-shelf MCP-compatible applications are unlikely to support this custom transport, so this solution is better suited to service-to-service communication than to end users.
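A least-privilege IAM policy for a caller of such a server might look like the following (the account ID and function name are placeholders):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "lambda:InvokeFunction",
      "Resource": "arn:aws:lambda:us-west-2:123456789012:function:my-mcp-server-function"
    }
  ]
}
```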

Python server example
import sys
from mcp.client.stdio import StdioServerParameters
from mcp_lambda import stdio_server_adapter

server_params = StdioServerParameters(
    command=sys.executable,
    args=[
        "-m",
        "my_mcp_server_python_module",
        "--my-server-command-line-parameter",
        "some_value",
    ],
)


def handler(event, context):
    return stdio_server_adapter(server_params, event, context)

See a full, deployable example here.

Typescript server example
import { Handler, Context } from "aws-lambda";
import { stdioServerAdapter } from "@aws/run-mcp-servers-with-aws-lambda";

const serverParams = {
  command: "npx",
  args: [
    "--offline",
    "my-mcp-server-typescript-module",
    "--my-server-command-line-parameter",
    "some_value",
  ],
};

export const handler: Handler = async (event, context: Context) => {
  return await stdioServerAdapter(serverParams, event, context);
};

See a full, deployable example here.

Python client example
from mcp import ClientSession
from mcp_lambda import LambdaFunctionParameters, lambda_function_client

server_params = LambdaFunctionParameters(
    function_name="my-mcp-server-function",
    region_name="us-west-2",
)

async with lambda_function_client(server_params) as (
    read_stream,
    write_stream,
):
    async with ClientSession(read_stream, write_stream) as session:
        await session.initialize()
        tool_result = await session.call_tool("echo", {"message": "hello"})

See a full example as part of the sample chatbot here.

Typescript client example
import {
  LambdaFunctionParameters,
  LambdaFunctionClientTransport,
} from "@aws/run-mcp-servers-with-aws-lambda";
import { Client } from "@modelcontextprotocol/sdk/client/index.js";

const serverParams: LambdaFunctionParameters = {
  functionName: "my-mcp-server-function",
  regionName: "us-west-2",
};

const client = new Client(
  {
    name: "my-client",
    version: "0.0.1",
  },
  {
    capabilities: {
      sampling: {},
    },
  }
);

const transport = new LambdaFunctionClientTransport(serverParams);
await client.connect(transport);

See a full example as part of the sample chatbot here.

Considerations

  • This library currently supports MCP servers and clients written in Python and Typescript. Other languages such as Kotlin are not supported.
  • This library only adapts stdio MCP servers for Lambda, not servers written for other transports such as SSE.
  • This library does not maintain any MCP server state or sessions across Lambda function invocations. Only stateless MCP servers are a good fit for using this library. For example, MCP servers that invoke stateless tools like the time MCP server or make stateless web requests like the fetch MCP server. Stateful MCP servers are not a good fit, because they will lose their state on every request. For example, MCP servers that manage data on disk or in memory such as the sqlite MCP server, the filesystem MCP server, and the git MCP server.
  • This library does not provide mechanisms for managing any secrets needed by the wrapped MCP server. For example, the GitHub MCP server and the Brave search MCP server require API keys to make requests to third-party APIs. You may configure these API keys as encrypted environment variables in the Lambda function's configuration. However, note that anyone with access to invoke the Lambda function will then have access to use your API key to call the third-party APIs by invoking the function. We recommend limiting access to the Lambda function using least-privilege IAM policies. If you use an identity-based authentication mechanism such as OAuth, you could also store and retrieve API keys per user but there are no implementation examples in this repository.

Deploy and run the examples

See the development guide for instructions to deploy and run the examples in this repository.

Security

See CONTRIBUTING for more information.

License

This project is licensed under the Apache-2.0 License.
