# OpenAI Agents SDK - MCP Extension

MCP (Model Context Protocol) extension for the OpenAI Agents SDK, built using mcp-agent.
This package extends the OpenAI Agents SDK to add support for Model Context Protocol (MCP) servers. With this extension, you can seamlessly use MCP servers and their tools with the OpenAI Agents SDK.
The project is built using the mcp-agent library.
## Features
- Connect OpenAI Agents to MCP servers
- Access tools from MCP servers alongside native OpenAI Agent SDK tools
- Configure MCP servers via standard configuration files
- Automatic tool discovery and conversion from MCP to Agent SDK format
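The discovery-and-conversion step can be illustrated with a small sketch. The dict shapes below are illustrative assumptions (an MCP server reporting tools with `name`, `description`, and `inputSchema`), not the extension's actual internal types:

```python
# Sketch of converting an MCP tool listing into an Agent-SDK-style
# function-tool record. Shapes are illustrative, not the real types.

def convert_mcp_tool(server_name: str, mcp_tool: dict) -> dict:
    """Map an MCP tool description to a function-tool-style dict.

    The tool name is prefixed with the server name so tools from
    different servers cannot collide.
    """
    return {
        "type": "function",
        "name": f"{server_name}_{mcp_tool['name']}",
        "description": mcp_tool.get("description", ""),
        "parameters": mcp_tool.get("inputSchema", {"type": "object", "properties": {}}),
    }

fetch_tool = {
    "name": "fetch",
    "description": "Fetch a URL and return its contents",
    "inputSchema": {"type": "object", "properties": {"url": {"type": "string"}}},
}
converted = convert_mcp_tool("fetch", fetch_tool)
print(converted["name"])  # fetch_fetch
```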
## Installation

```bash
uv add openai-agents-mcp
```

or with pip:

```bash
pip install openai-agents-mcp
```
## Quick Start
> [!TIP]
> The `examples` directory has several example applications to get started with. To run an example, clone this repo, then:
>
> ```bash
> cd examples
> cp mcp_agent.secrets.yaml.example mcp_agent.secrets.yaml  # Update API keys if needed
> uv run hello_world_mcp.py  # Or any other example
> ```
To use the Agents SDK with MCP, simply replace the following import:

```diff
- from agents import Agent
+ from agents_mcp import Agent
```
With that change, you can instantiate an `Agent` with `mcp_servers` in addition to `tools` (which continue to work as before).
```python
from agents_mcp import Agent

# Create an agent with the specific MCP servers you want to use.
# These must be defined in your mcp_agent.config.yaml file.
agent = Agent(
    name="MCP Agent",
    instructions=(
        "You are a helpful assistant with access to both local/OpenAI tools "
        "and tools from MCP servers. Use these tools to help the user."
    ),
    # Local/OpenAI tools
    tools=[get_current_weather],
    # Names of the MCP servers to use, as defined in your mcp_agent config
    mcp_servers=["fetch", "filesystem"],
)
```
Then define an `mcp_agent.config.yaml` with the MCP server configuration:
```yaml
mcp:
  servers:
    fetch:
      command: npx
      args: ["-y", "@modelcontextprotocol/server-fetch"]
    filesystem:
      command: npx
      args: ["-y", "@modelcontextprotocol/server-filesystem", "."]
```
That's it! The rest of the Agents SDK works exactly as before. Head over to the `examples` directory to see MCP servers in action with the Agents SDK.
## Demo
https://github.com/user-attachments/assets/1d2a843d-2f99-41f2-8671-4c7940ec48f5
More details and nuances below.
## Using MCP servers in the Agents SDK
### `mcp_servers` property on `Agent`
You can give an `Agent` access to MCP servers by setting its `mcp_servers` property to a list of server names. The agent will then automatically aggregate the tools exposed by those servers with any tools specified directly, producing a single extended tool list. This means you can seamlessly use local tools, MCP servers, and other kinds of Agent SDK tools through a single unified syntax.
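Conceptually, the aggregation is just a merge of the two tool sources into one list. A minimal sketch of that idea in plain Python (not the extension's actual internals; server and tool names are illustrative):

```python
# Sketch of aggregating local tools with tools discovered from MCP
# servers into a single combined list.

def aggregate_tools(local_tools: list[str], server_tools: dict[str, list[str]]) -> list[str]:
    """Combine local tool names with per-server MCP tool names.

    MCP tool names are namespaced by server so they cannot collide
    with local tools or with tools from other servers.
    """
    combined = list(local_tools)
    for server, tools in server_tools.items():
        combined.extend(f"{server}_{tool}" for tool in tools)
    return combined

tools = aggregate_tools(
    ["get_current_weather"],
    {"fetch": ["fetch"], "filesystem": ["read_file", "write_file"]},
)
print(tools)
```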
```python
agent = Agent(
    name="MCP Assistant",
    instructions="You are a helpful assistant with access to MCP tools.",
    tools=[your_other_tools],  # Regular Agent SDK tools
    mcp_servers=["fetch", "filesystem"],  # Names of MCP servers from your config file (see below)
)
```
### MCP Configuration File
Configure MCP servers by creating an mcp_agent.config.yaml file. You can place this file in your project directory or any parent directory.
Here's an example configuration file that defines three MCP servers:
```yaml
$schema: "https://raw.githubusercontent.com/lastmile-ai/mcp-agent/main/schema/mcp-agent.config.schema.json"

mcp:
  servers:
    fetch:
      command: "uvx"
      args: ["mcp-server-fetch"]
    filesystem:
      command: "npx"
      args: ["-y", "@modelcontextprotocol/server-filesystem", "."]
    slack:
      command: "npx"
      args: ["-y", "@modelcontextprotocol/server-slack"]
```
For servers that require sensitive information like API keys, you can:

- Define them directly in the config file (not recommended for production)
- Use a separate `mcp_agent.secrets.yaml` file (more secure)
- Set them as environment variables
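One common pattern for combining a main config with a secrets file is a deep merge, with the secrets file taking precedence. A hedged sketch of that idea (this is not the library's actual loader; the config shapes and the placeholder token are illustrative):

```python
# Sketch: deep-merge a secrets dict into a config dict,
# with values from the secrets dict winning on conflicts.

def deep_merge(base: dict, override: dict) -> dict:
    """Recursively merge override into base; override wins on conflicts."""
    merged = dict(base)
    for key, value in override.items():
        if isinstance(value, dict) and isinstance(merged.get(key), dict):
            merged[key] = deep_merge(merged[key], value)
        else:
            merged[key] = value
    return merged

config = {"mcp": {"servers": {"slack": {"command": "npx", "env": {}}}}}
secrets = {"mcp": {"servers": {"slack": {"env": {"SLACK_BOT_TOKEN": "placeholder"}}}}}
merged = deep_merge(config, secrets)
print(merged["mcp"]["servers"]["slack"])
```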
### Methods for Configuring MCP
This extension supports several ways to configure MCP servers:
#### 1. Automatic Discovery (Recommended)
The simplest approach lets the SDK automatically find your configuration files if they are named `mcp_agent.config.yaml` and `mcp_agent.secrets.yaml`:
```python
from agents import Runner
from agents_mcp import Agent, RunnerContext

# Create an agent that references MCP servers
agent = Agent(
    name="MCP Assistant",
    instructions="You are a helpful assistant with access to MCP tools.",
    mcp_servers=["fetch", "filesystem"],  # Names of servers from your config file
)

result = await Runner.run(agent, input="Hello world", context=RunnerContext())
```
#### 2. Explicit Config Path
You can explicitly specify the path to your config file:
```python
from agents_mcp import RunnerContext

context = RunnerContext(mcp_config_path="/path/to/mcp_agent.config.yaml")
```
#### 3. Programmatic Configuration
You can programmatically define your MCP settings:
```python
from mcp_agent.config import MCPSettings, MCPServerSettings
from agents_mcp import RunnerContext

# Define the MCP config programmatically
mcp_config = MCPSettings(
    servers={
        "fetch": MCPServerSettings(
            command="uvx",
            args=["mcp-server-fetch"],
        ),
        "filesystem": MCPServerSettings(
            command="npx",
            args=["-y", "@modelcontextprotocol/server-filesystem", "."],
        ),
    }
)

context = RunnerContext(mcp_config=mcp_config)
```
#### 4. Custom Server Registry
You can create and configure your own MCP server registry:
```python
from mcp_agent.mcp_server_registry import ServerRegistry
from mcp_agent.config import get_settings
from agents_mcp import Agent

# Create a custom server registry
settings = get_settings("/path/to/config.yaml")
server_registry = ServerRegistry(config=settings)

# Create an agent with this registry
agent = Agent(
    name="Custom Registry Agent",
    instructions="You have access to custom MCP servers.",
    mcp_servers=["fetch", "filesystem"],
    mcp_server_registry=server_registry,  # Use the custom registry
)
```
## Examples
### Basic Hello World
A simple example demonstrating how to create an agent that uses MCP tools:
```python
from agents import Runner
from agents_mcp import Agent, RunnerContext

# Create an agent with local tools and MCP servers
agent = Agent(
    name="MCP Assistant",
    instructions="You are a helpful assistant with access to tools.",
    tools=[get_current_weather],  # Local tools
    mcp_servers=["fetch", "filesystem"],  # MCP servers
)

# Run the agent
result = await Runner.run(
    agent,
    input="What's the weather in Miami? Also, can you fetch the OpenAI website?",
    context=RunnerContext(),
)

print(result.response.value)
```
See `hello_world_mcp.py` for the complete example.
### Streaming Responses
To stream responses instead of waiting for the complete result:
```python
from openai.types.responses import ResponseTextDeltaEvent

result = Runner.run_streamed(  # Note: no await here
    agent,
    input="Print the first paragraph of https://openai.github.io/openai-agents-python/",
    context=context,
)

# Stream the events
async for event in result.stream_events():
    if event.type == "raw_response_event" and isinstance(event.data, ResponseTextDeltaEvent):
        print(event.data.delta, end="", flush=True)
```
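The consumption pattern above, iterating over events, filtering on the event type, and printing deltas as they arrive, can be exercised with a stand-in event stream using plain asyncio (no SDK required; `FakeDeltaEvent` and `fake_stream` are hypothetical stand-ins):

```python
# Sketch: the stream-and-filter loop against a fake event source.
import asyncio
from dataclasses import dataclass

@dataclass
class FakeDeltaEvent:
    type: str
    delta: str

async def fake_stream():
    """Yield text deltas one at a time, as a streamed run would."""
    for chunk in ["Hello", ", ", "world", "!"]:
        yield FakeDeltaEvent(type="raw_response_event", delta=chunk)
        await asyncio.sleep(0)  # simulate pacing between chunks

async def main() -> str:
    collected = []
    async for event in fake_stream():
        if event.type == "raw_response_event":
            collected.append(event.delta)
            print(event.delta, end="", flush=True)
    print()
    return "".join(collected)

text = asyncio.run(main())
```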
See `hello_world_mcp_streamed.py` for the complete example.
## Acknowledgements

This project is made possible thanks to the OpenAI Agents SDK, the Model Context Protocol, and the mcp-agent library.
## License

MIT