FastMCP Agents

Why teach every Agent how to use every tool? Why put the instructions for running git_clone into every Agent you write? Why do you have to keep telling it that it can't clone with depth: 0?

What if you could embed an expert user of a Server's tools directly into the Server itself?

Adding FastMCP Agents to your MCP Server

FastMCP Agents is a framework for building Agents into FastMCP Servers.

Instead of building an MCP server that exposes dozens or hundreds of generic tools and expecting your consumers to figure out how to use them, you can embed an optional AI Agent directly into your MCP Server. The Agent takes plain-language requests from a user or another AI Agent and carries them out using the available tools:

file_agent = FastMCPAgent(
    name="Filesystem Agent",
    description="Assists with locating, categorizing, searching, reading, or writing files on the system.",
    default_instructions="""
    When you are asked to perform a task that requires you to interact with local files,
    you should leverage the tools available to you to perform the task. If you are asked a
    question, you should leverage the tools available to you to answer the question.
    """,
    llm_link=AsyncLitellmLLMLink.from_model(
        model="vertex_ai/gemini-2.5-flash-preview-05-20",
    ),
)

file_agent.register_as_tools(server)
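Conceptually, registering the agent as tools exposes the agent itself as a callable tool on the server, so callers invoke it like any other tool. Here is a library-free sketch of that registration pattern; every class and method name below is illustrative, not the FastMCP Agents API:

```python
# Illustrative only: a toy "server" and "agent" showing the registration
# pattern. These names are stand-ins, not the FastMCP Agents API.

class ToyServer:
    def __init__(self):
        self.tools = {}  # tool name -> callable

    def add_tool(self, name, fn):
        self.tools[name] = fn

    def call_tool(self, name, **kwargs):
        return self.tools[name](**kwargs)


class ToyAgent:
    def __init__(self, name):
        self.name = name

    def run(self, instructions):
        # A real agent would call an LLM and its tools here.
        return f"{self.name} handled: {instructions}"

    def register_as_tools(self, server):
        # Expose the agent itself as an ordinary tool on the server.
        server.add_tool(self.name, lambda instructions: self.run(instructions))


server = ToyServer()
ToyAgent("Filesystem Agent").register_as_tools(server)
print(server.call_tool("Filesystem Agent", instructions="find all CSV files"))
```

Callers never need to know which underlying tools the agent uses; they only hand it a plain-language request.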

You retain full flexibility to dynamically constrain the embedded Agent based on information provided by the caller:

def ask_about_issue(ctx: Context, issue_number: int) -> str:
    """Ask about an issue in the repository."""

    def get_relevant_issue(issue_number: int) -> str:
        """Get the relevant issue from the repository."""
        return github.get_issue(issue_number)

    issue_tool = Tool.from_function(
        name="get_relevant_issue",
        description="Get the relevant issue from the repository.",
        function=get_relevant_issue,
    )

    return issue_agent.run(
        issue_number=issue_number,
        instructions="""
        You are an expert at triaging GitHub issues.
        """,
        tools=[issue_tool],
    )
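The key mechanism here is a closure: the per-request tool captures caller-provided state, so the agent only ever sees an already-scoped tool. A library-free sketch of that scoping pattern, where `FakeGitHub` is a stand-in rather than a real client:

```python
# Illustrative only: per-request tool scoping via a closure.

class FakeGitHub:
    """Stand-in for a real GitHub client."""
    def get_issue(self, issue_number):
        return f"Issue #{issue_number}: example title"


def make_scoped_issue_tool(github, issue_number):
    # The returned callable is pre-bound to one issue: an agent that
    # receives it cannot read any other issue.
    def get_relevant_issue():
        return github.get_issue(issue_number)
    return get_relevant_issue


tool = make_scoped_issue_tool(FakeGitHub(), 42)
print(tool())  # → Issue #42: example title
```

Because the scope is baked in at construction time, the constraint holds no matter what instructions the agent receives.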

Adding FastMCP Agents to other people's MCP Servers

You can wrap any existing MCP Server and embed an AI Agent into it, so the wrapped server can be used as a tool by other Agents. Combined with https://github.com/jlowin/fastmcp/pull/599, this enables entirely new ways of using MCP.

For example, you can take the upstream GitHub MCP Server, improve any tool's name or description, add safeguards, set default parameters such as page_size, limit response sizes, and so on, then expose the result as a new MCP Server.

In the example below, we wrap a third-party MCP server, apply tool overrides, add a response-size safeguard, and embed an Agent in front of it:

third_party_mcp_config = {
    "time": {
        "command": "uvx",
        "args": [
            "git+https://github.com/modelcontextprotocol/servers.git@2025.4.24#subdirectory=src/time",
            "--local-timezone=America/New_York",
        ],
    }
}

override_config_yaml = ToolOverrides.from_yaml("""
tools:
  search_issues:
    description: >-
        An updated multi-line description 
        for the search_issues tool.
    parameter_overrides:
      query:
        description: The query to search for issues.
        default: "is:open"
""")
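A parameter override with a `default` behaves like a default merge: if the caller omits the parameter, the override's value is filled in before the call reaches the upstream tool. A library-free sketch of that merge, with all names below illustrative rather than taken from the `ToolOverrides` implementation:

```python
# Illustrative only: applying parameter-override defaults before a tool call.

PARAMETER_OVERRIDES = {
    "search_issues": {
        "query": {
            "description": "The query to search for issues.",
            "default": "is:open",
        },
    },
}


def apply_defaults(tool_name, arguments):
    """Fill in override defaults for any parameters the caller omitted."""
    merged = dict(arguments)
    for param, override in PARAMETER_OVERRIDES.get(tool_name, {}).items():
        if "default" in override:
            merged.setdefault(param, override["default"])
    return merged


print(apply_defaults("search_issues", {}))                # {'query': 'is:open'}
print(apply_defaults("search_issues", {"query": "bug"}))  # {'query': 'bug'}
```

Explicit caller arguments always win; the default only applies to omitted parameters.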


async def async_main():
    async with Client(third_party_mcp_config) as remote_mcp_client:
        proxied_mcp_server = FastMCP.as_proxy(remote_mcp_client)

        frontend_server = FastMCP("Frontend Server")

        def limit_response_size(response: str) -> str:
            """Limit the response size to 1000 characters."""
            if len(response) > 1000:
                raise ValueError("Response size is too large.")
            return response

        await transform_tools_from_server(
            proxied_mcp_server,
            frontend_server,
            overrides=override_config_yaml,
            post_call_hooks=[limit_response_size],
        )

        github_agent = FastMCPAgent(
            name="GitHub Agent",
            description="Assists with GitHub-related tasks like searching issues, PRs, and more.",
            default_instructions="""
            You are an expert at triaging GitHub issues...
            """,
            llm_link=AsyncLitellmLLMLink.from_model(
                model=os.environ["FASTMCP_AGENTS_DEFAULT_MODEL"],
            ),
        )

        github_agent.register_as_tools(frontend_server)

        await frontend_server.run_async(transport="sse")


if __name__ == "__main__":
    asyncio.run(async_main())
