
WingMate

Orchestrate AI agents locally with ease.

WingMate is a lightweight, flexible Python framework for building and orchestrating AI agents. It is designed to run locally, with a tool definition system closely aligned with the Model Context Protocol (MCP), making it very simple to integrate MCP servers.

Features

  • Local-First: Designed to run agents in your local environment.
  • MCP Integration: Seamlessly integrate with MCP servers and tools.
  • Thoughtful Agents: Optional "thought" process visibility for debugging and transparency.
  • Highly Configurable: Easy configuration via YAML or environment variables.
  • Streaming: Built-in support for streaming agent responses.
  • Extensible: Create custom environments to control tool execution and context.

Installation

You can install WingMate directly from PyPI:

pip install wingmate

Quick Start

Here is a simple example of how to create an agent that can perform date calculations using MCP tools.

1. Define Tools & Environment

Create a file named main.py:

import asyncio
from fastmcp import Client, FastMCP
from wingmate import DefaultEnvironment, Agent
from wingmate.types import BaseToolModel, CallToolRequestParams
from wingmate.utils import mcp_tools

# 1. Define tools using FastMCP
mcp = FastMCP()

@mcp.tool
def day_of_date(date: str) -> str:
    """Get the day of the week for a given date string in YYYY-MM-DD format."""
    import datetime
    dt = datetime.datetime.strptime(date, "%Y-%m-%d")
    return dt.strftime("%A")

# 2. Create an MCP client
client = Client(mcp)

# 3. Define the Environment
# The environment handles tool execution and context management
class SimpleEnvironment[T: BaseToolModel](DefaultEnvironment[T]):
    async def call_tool(self, action: CallToolRequestParams[T]) -> str | None:
        async with client:
            result = await client.call_tool(
                name=action.tool_name,
                arguments=action.arguments.model_dump()
            )
        return "\n".join(res.text for res in result.content if res.type == "text")

# 4. Initialize Agent
async def main():
    # Initialize environment with tools from the MCP client
    env = SimpleEnvironment(tools=mcp_tools(client))

    # Create the agent
    agent = Agent(
        environment=env,
        disable_thought=False  # Keep this False to see the agent's thought trace
    )

    # Add a user message to the history
    env.history.add_message(role="user", content="What day was it on January 1st, 2000?")

    # Run the agent
    async for event in agent.stream():
        print(event.model_dump_json(indent=2))
        print("------------")

if __name__ == "__main__":
    asyncio.run(main())

2. Configure LLM

Create a wingmate-config.yaml file in your project root to configure your LLM provider (e.g., OpenAI or OpenAI-compatible):

llm_model_name: "gpt-4o"
llm_api_key: "your-api-key-here"
# Optional: Base URL for other compatible providers
# llm_base_url: "https://api.openai.com/v1"

3. Run

python main.py

Configuration

WingMate uses pydantic-settings for configuration. You can configure it via a wingmate-config.yaml file, a .env file, or environment variables.

Setting               Description                                      Default
llm_model_name        The name of the LLM model to use.                None
llm_api_key           API key for the LLM provider.                    None
llm_base_url          Base URL for the LLM API.                        None
max_agent_iterations  Maximum number of loops the agent can perform.   7
max_history_length    Maximum number of messages to keep in history.   11
llm_api_extra_kw      Extra keyword arguments for the LLM API call.    {}
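Because the settings are loaded through pydantic-settings, the same keys can also live in a .env file in your project root. A minimal sketch, assuming WingMate's settings model reads these field names directly (pydantic-settings matches field names case-insensitively by default):

```
# .env — illustrative values; keys mirror the settings table above
llm_model_name=gpt-4o
llm_api_key=your-api-key-here
max_agent_iterations=10
```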

Advanced Usage

Custom Environments

The Environment class is the heart of WingMate's extensibility. By subclassing DefaultEnvironment or implementing the Environment protocol, you can:

  • Customize Tool Execution: Handle tool calls locally, remotely, or via complex pipelines.
  • Manage Context: Control how history is stored, retrieved, and presented to the LLM.
  • Implement Termination Logic: Define custom criteria for when the agent should stop.

Thought Process

WingMate can expose the agent's internal "thought" process. When disable_thought=False is passed to the Agent constructor, the agent generates a thought trace before taking actions or answering. This is useful for debugging and for understanding the agent's reasoning.

License

MIT
