
WingMate

Orchestrate AI agents locally with ease.

WingMate is a lightweight, flexible Python framework for building and orchestrating AI agents. It is designed to run locally, with a tool definition system closely aligned with the Model Context Protocol (MCP), making it very simple to integrate MCP servers.

Features

  • Local-First: Designed to run agents in your local environment.
  • MCP Integration: Seamlessly integrate with MCP servers and tools.
  • Thoughtful Agents: Optional "thought" process visibility for debugging and transparency.
  • Highly Configurable: Easy configuration via YAML or environment variables.
  • Streaming: Built-in support for streaming agent responses.
  • Extensible: Create custom environments to control tool execution and context.

Installation

You can install Wingmate from PyPI:

pip install wingmate

Quick Start

Here is a simple example of how to create an agent that can perform date calculations using MCP tools.

1. Define Tools & Environment

Create a file named main.py:

import asyncio
from fastmcp import Client, FastMCP
from wingmate import DefaultEnvironment, Agent
from wingmate.types import BaseToolModel, CallToolRequestParams
from wingmate.utils import mcp_tools

# 1. Define tools using FastMCP
mcp = FastMCP()

@mcp.tool
def day_of_date(date: str) -> str:
    """Get the day of the week for a given date string in YYYY-MM-DD format."""
    import datetime
    dt = datetime.datetime.strptime(date, "%Y-%m-%d")
    return dt.strftime("%A")

# 2. Create an MCP client
client = Client(mcp)

# 3. Define the Environment
# The environment handles tool execution and context management
class SimpleEnvironment[T: BaseToolModel](DefaultEnvironment[T]):
    async def call_tool(self, action: CallToolRequestParams[T]) -> str | None:
        async with client:
            result = await client.call_tool(
                name=action.tool_name,
                arguments=action.arguments.model_dump()
            )
        return "\n".join(res.text for res in result.content if res.type == "text")

# 4. Initialize Agent
async def main():
    # Initialize environment with tools from the MCP client
    env = SimpleEnvironment(tools=mcp_tools(client))

    # Create the agent
    agent = Agent(
        environment=env,
        disable_thought=False  # Set to False to see the agent's thinking process
    )

    # Add a user message to the history
    env.history.add_message(role="user", content="What day was it on January 1st, 2000?")

    # Run the agent
    async for event in agent.stream():
        print(event.model_dump_json(indent=2))
        print("------------")

if __name__ == "__main__":
    asyncio.run(main())

2. Configure LLM

Create a wingmate-config.yaml file in your project root to configure your LLM provider (e.g., OpenAI or OpenAI-compatible):

llm_model_name: "gpt-4o"
llm_api_key: "your-api-key-here"
# Optional: Base URL for other compatible providers
# llm_base_url: "https://api.openai.com/v1"

3. Run

python main.py

Configuration

Wingmate uses pydantic-settings for configuration. You can configure it using a wingmate-config.yaml file, a .env file, or environment variables.

| Setting | Description | Default |
| --- | --- | --- |
| llm_model_name | The name of the LLM model to use. | None |
| llm_api_key | API key for the LLM provider. | None |
| llm_base_url | Base URL for the LLM API. | None |
| max_agent_iterations | Maximum number of loops the agent can perform. | 7 |
| max_history_length | Maximum number of messages to keep in history. | 11 |
| llm_api_extra_kw | Extra keyword arguments for the LLM API call. | {} |
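Because settings are read through pydantic-settings, the same options can also come from a .env file or environment variables. A sketch, assuming pydantic-settings' default field-name-to-variable mapping (verify the exact variable names against your installed version):

```ini
# .env — equivalent to the wingmate-config.yaml example above
# (assumes pydantic-settings' default field-name mapping)
LLM_MODEL_NAME=gpt-4o
LLM_API_KEY=your-api-key-here
MAX_AGENT_ITERATIONS=7
```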

Advanced Usage

Custom Environments

The Environment class is the heart of Wingmate's extensibility. By subclassing DefaultEnvironment or implementing the Environment protocol, you can:

  • Customize Tool Execution: Handle tool calls locally, remotely, or via complex pipelines.
  • Manage Context: Control how history is stored, retrieved, and presented to the LLM.
  • Implement Termination Logic: Define custom criteria for when the agent should stop.

Thought Process

Wingmate can expose the agent's internal "thought" process. When disable_thought=False is passed to the Agent constructor, the agent generates a thought trace before taking actions or answering. This is useful for debugging and for understanding the agent's reasoning.

License

MIT

Project details


Download files

Download the file for your platform.

Source Distribution

wingmate-0.3.3.tar.gz (12.4 kB)

Built Distribution


wingmate-0.3.3-py3-none-any.whl (16.1 kB)

File details

Details for the file wingmate-0.3.3.tar.gz.

File metadata

  • Download URL: wingmate-0.3.3.tar.gz
  • Size: 12.4 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for wingmate-0.3.3.tar.gz
| Algorithm | Hash digest |
| --- | --- |
| SHA256 | f8d07272ceb2b78d1d7d7a8d9548140ca9dfe2ea057ccb7fe18bdf7d330474bd |
| MD5 | f256ed80065a83a30eec90b3b08f06e6 |
| BLAKE2b-256 | 177bf9c541050c0d8efaae1ef554e04425240bf1e9e8cdbd6efc23c630122d7d |


Provenance

The following attestation bundles were made for wingmate-0.3.3.tar.gz:

Publisher: release.yml on pritam-dey3/WingMate

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file wingmate-0.3.3-py3-none-any.whl.

File metadata

  • Download URL: wingmate-0.3.3-py3-none-any.whl
  • Size: 16.1 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for wingmate-0.3.3-py3-none-any.whl
| Algorithm | Hash digest |
| --- | --- |
| SHA256 | df626c36f7a35e772a0a1436bad445d18cd4b292c4aacaf919f24517a00a2a60 |
| MD5 | 28031f05604a914937121264fab38ecc |
| BLAKE2b-256 | 40718a7c854022bbae7f118b32ae123308d1a9d8b0b5e18baaa2691abe5d7903 |


Provenance

The following attestation bundles were made for wingmate-0.3.3-py3-none-any.whl:

Publisher: release.yml on pritam-dey3/WingMate

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
