
MCP (Model Context Protocol) extension for OpenAI Agents SDK, built using mcp-agent

Project description

OpenAI Agents SDK - MCP Extension

This package extends the OpenAI Agents SDK to add support for Model Context Protocol (MCP) servers. With this extension, you can seamlessly use MCP servers and their tools with the OpenAI Agents SDK.

The project is built using the mcp-agent library.


Features

  • Connect OpenAI Agents to MCP servers
  • Access tools from MCP servers alongside native OpenAI Agent SDK tools
  • Configure MCP servers via standard configuration files
  • Automatic tool discovery and conversion from MCP to Agent SDK format

Installation

Install with uv:

uv add openai-agents-mcp

or with pip:

pip install openai-agents-mcp

Quick Start

[!TIP] The examples directory has several example applications to get started with. To run an example, clone this repo, then:

cd examples
cp mcp_agent.secrets.yaml.example mcp_agent.secrets.yaml # Update API keys if needed
uv run hello_world_mcp.py # Or any other example

To use the Agents SDK with MCP, simply replace the following import:

- from agents import Agent
+ from agents_mcp import Agent

With that change, you can instantiate an Agent with mcp_servers in addition to tools (which continue to work as before).

    from agents_mcp import Agent

    # Create an agent with specific MCP servers you want to use
    # These must be defined in your mcp_agent.config.yaml file
    agent = Agent(
        name="MCP Agent",
        instructions="""You are a helpful assistant with access to both local/OpenAI tools and tools from MCP servers. Use these tools to help the user.""",
        # Local/OpenAI tools
        tools=[get_current_weather],
        # Specify which MCP servers to use
        # These must be defined in your mcp_agent config
        mcp_servers=["fetch", "filesystem"],
    )
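The get_current_weather function above is a local tool supplied by the user. A minimal stand-alone stub (the name comes from the example; the body here is illustrative, and in the real examples the function is registered with the Agents SDK's function_tool decorator):

```python
# Hypothetical stub of the local `get_current_weather` tool referenced above.
# In the actual examples it is wrapped with the Agents SDK's `function_tool`
# decorator; here it is a plain function so the sketch stays self-contained.
def get_current_weather(city: str) -> str:
    """Return a canned weather report for `city` (no real API call)."""
    return f"The weather in {city} is sunny and 72°F."

weather = get_current_weather("Miami")
```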

Then define an mcp_agent.config.yaml, with the MCP server configuration:

mcp:
  servers:
    fetch:
      command: npx
      args: ["-y", "@modelcontextprotocol/server-fetch"]
    filesystem:
      command: npx
      args: ["-y", "@modelcontextprotocol/server-filesystem", "."]

That's it! The rest of the Agents SDK works exactly as before.

Head over to the examples directory to see MCP servers in action with Agents SDK.

Demo

https://github.com/user-attachments/assets/1d2a843d-2f99-41f2-8671-4c7940ec48f5

More details and nuances below.

Using MCP servers in Agents SDK

mcp_servers property on Agent

You can specify the names of MCP servers to give an Agent access to by setting its mcp_servers property.

The Agent will then automatically aggregate tools from the servers, as well as any tools specified, and create a single extended list of tools. This means you can seamlessly use local tools, MCP servers, and other kinds of Agent SDK tools through a single unified syntax.

agent = Agent(
    name="MCP Assistant",
    instructions="You are a helpful assistant with access to MCP tools.",
    tools=[your_other_tools], # Regular tool use for Agent SDK
    mcp_servers=["fetch", "filesystem"]  # Names of MCP servers from your config file (see below)
)

MCP Configuration File

Configure MCP servers by creating an mcp_agent.config.yaml file. You can place this file in your project directory or any parent directory.

Here's an example configuration file that defines three MCP servers:

$schema: "https://raw.githubusercontent.com/lastmile-ai/mcp-agent/main/schema/mcp-agent.config.schema.json"

mcp:
  servers:
    fetch:
      command: "uvx"
      args: ["mcp-server-fetch"]
    filesystem:
      command: "npx"
      args: ["-y", "@modelcontextprotocol/server-filesystem", "."]
    slack:
      command: "npx"
      args: ["-y", "@modelcontextprotocol/server-slack"]

For servers that require sensitive information like API keys, you can:

  1. Define them directly in the config file (not recommended for production)
  2. Use a separate mcp_agent.secrets.yaml file (more secure)
  3. Set them as environment variables
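For instance, a separate secrets file might supply the Slack server's credentials through environment variables. A hypothetical sketch (the keys follow the config schema above; the token values are placeholders):

```yaml
# mcp_agent.secrets.yaml -- keep this file out of version control
mcp:
  servers:
    slack:
      env:
        SLACK_BOT_TOKEN: "xoxb-your-bot-token"
        SLACK_TEAM_ID: "T01234567"
```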

Methods for Configuring MCP

This extension supports several ways to configure MCP servers:

1. Automatic Discovery (Recommended)

The simplest approach lets the SDK automatically find your configuration files if they are named mcp_agent.config.yaml and mcp_agent.secrets.yaml:

from agents import Runner
from agents_mcp import Agent, RunnerContext

# Create an agent that references MCP servers
agent = Agent(
    name="MCP Assistant",
    instructions="You are a helpful assistant with access to MCP tools.",
    mcp_servers=["fetch", "filesystem"]  # Names of servers from your config file
)

result = await Runner.run(agent, input="Hello world", context=RunnerContext())

2. Explicit Config Path

You can explicitly specify the path to your config file:

from agents_mcp import RunnerContext

context = RunnerContext(mcp_config_path="/path/to/mcp_agent.config.yaml")

3. Programmatic Configuration

You can programmatically define your MCP settings:

from mcp_agent.config import MCPSettings, MCPServerSettings
from agents_mcp import RunnerContext

# Define MCP config programmatically
mcp_config = MCPSettings(
    servers={
        "fetch": MCPServerSettings(
            command="uvx",
            args=["mcp-server-fetch"]
        ),
        "filesystem": MCPServerSettings(
            command="npx",
            args=["-y", "@modelcontextprotocol/server-filesystem", "."]
        )
    }
)

context = RunnerContext(mcp_config=mcp_config)

4. Custom Server Registry

You can create and configure your own MCP server registry:

from mcp_agent.mcp_server_registry import ServerRegistry
from mcp_agent.config import get_settings

from agents_mcp import Agent

# Create a custom server registry
settings = get_settings("/path/to/config.yaml")
server_registry = ServerRegistry(config=settings)

# Create an agent with this registry
agent = Agent(
    name="Custom Registry Agent",
    instructions="You have access to custom MCP servers.",
    mcp_servers=["fetch", "filesystem"],
    mcp_server_registry=server_registry  # Use custom registry
)

Examples

Basic Hello World

A simple example demonstrating how to create an agent that uses MCP tools:

from agents import Runner
from agents_mcp import Agent, RunnerContext

# Create an agent with MCP servers
agent = Agent(
    name="MCP Assistant",
    instructions="You are a helpful assistant with access to tools.",
    tools=[get_current_weather],  # Local tools
    mcp_servers=["fetch", "filesystem"],  # MCP servers
)

# Run the agent
result = await Runner.run(
    agent,
    input="What's the weather in Miami? Also, can you fetch the OpenAI website?",
    context=RunnerContext(),
)

print(result.final_output)

See hello_world_mcp.py for the complete example.

Streaming Responses

To stream responses instead of waiting for the complete result:

from openai.types.responses import ResponseTextDeltaEvent

result = Runner.run_streamed(  # Note: no await here
    agent,
    input="Print the first paragraph of https://openai.github.io/openai-agents-python/",
    context=context,
)

# Stream the events
async for event in result.stream_events():
    if event.type == "raw_response_event" and isinstance(event.data, ResponseTextDeltaEvent):
        print(event.data.delta, end="", flush=True)

See hello_world_mcp_streamed.py for the complete example.
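The consumption pattern above — iterate the event stream, filter by event type, flush text deltas as they arrive — can be sketched stand-alone with a stub stream (TextDelta and fake_stream are illustrative stand-ins, not SDK types):

```python
import asyncio
from dataclasses import dataclass

@dataclass
class TextDelta:
    """Stand-in for the SDK's text-delta event type."""
    delta: str

async def fake_stream():
    # Stand-in for result.stream_events(): yields text chunks one at a time.
    for chunk in ["Hello", ", ", "world"]:
        yield TextDelta(chunk)

async def consume() -> str:
    parts = []
    async for event in fake_stream():
        if isinstance(event, TextDelta):  # filter, like the isinstance check above
            print(event.delta, end="", flush=True)
            parts.append(event.delta)
    return "".join(parts)

text = asyncio.run(consume())
```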

Acknowledgements

This project is made possible thanks to the following projects:

  • mcp-agent
  • OpenAI Agents SDK

License

MIT

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

openai_agents_mcp-0.0.8.tar.gz (131.1 kB)

Uploaded Source

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

openai_agents_mcp-0.0.8-py3-none-any.whl (14.9 kB)

Uploaded Python 3

File details

Details for the file openai_agents_mcp-0.0.8.tar.gz.

File metadata

  • Download URL: openai_agents_mcp-0.0.8.tar.gz
  • Upload date:
  • Size: 131.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.12.9

File hashes

Hashes for openai_agents_mcp-0.0.8.tar.gz

| Algorithm | Hash digest |
| --- | --- |
| SHA256 | 1b108dc9a612a3b1195390bf95ff37bbc9d63479543bfb915e0c13d9cb1b93e4 |
| MD5 | f33b375c350c591eade6ce8502e5e160 |
| BLAKE2b-256 | f211155573e7622aab5bf0db4e1b65019d69e5182351ad3db930f9040d028114 |

See more details on using hashes here.

Provenance

The following attestation bundles were made for openai_agents_mcp-0.0.8.tar.gz:

Publisher: publish.yml on lastmile-ai/openai-agents-mcp

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file openai_agents_mcp-0.0.8-py3-none-any.whl.

File metadata

File hashes

Hashes for openai_agents_mcp-0.0.8-py3-none-any.whl

| Algorithm | Hash digest |
| --- | --- |
| SHA256 | 3bdcce1819f4040e6451e6837f21df8e8a54504c0e055a886e17b383d3328124 |
| MD5 | 93c530badf0eea47ed07a2c0bcd0fe4b |
| BLAKE2b-256 | 9a0b60e3f86a3802c4235f8fd484de3083c92b04cc667660bf844b44c8a729b6 |

See more details on using hashes here.

Provenance

The following attestation bundles were made for openai_agents_mcp-0.0.8-py3-none-any.whl:

Publisher: publish.yml on lastmile-ai/openai-agents-mcp

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
