
A lightweight, stateless multi-agent orchestration framework.

Project description

SwarmX (forked from OpenAI's Swarm)


An extremely simple framework exploring ergonomic, lightweight multi-agent orchestration.

Highlights

  1. SwarmX is both an Agent and a Workflow
  2. MCP server support
  3. OpenAI-compatible streaming server
  4. Workflow import/export in JSON format



Quick start

SwarmX automatically loads environment variables from a .env file if present. You can either:

  1. Use a .env file (recommended):

    # Create a .env file in your project directory
    echo "OPENAI_API_KEY=your-api-key" > .env
    echo "OPENAI_BASE_URL=http://localhost:11434/v1" >> .env  # optional
    uvx swarmx  # Start interactive REPL
    
  2. Set environment variables manually:

    export OPENAI_API_KEY="your-api-key"
    # export OPENAI_BASE_URL="http://localhost:11434/v1"  # optional
    uvx swarmx  # Start interactive REPL
    

API Server

You can also start SwarmX as an OpenAI-compatible API server:

uvx swarmx serve --host 0.0.0.0 --port 8000

This provides OpenAI-compatible endpoints:

  • POST /v1/chat/completions - Chat completions with streaming support
  • GET /v1/models - List available models

Use it with any OpenAI-compatible client:

import openai

client = openai.OpenAI(
    base_url="http://localhost:8000/v1",
    api_key="dummy"  # SwarmX doesn't require authentication
)

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Hello!"}]
)
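The `/v1/chat/completions` endpoint also supports streaming. The sketch below builds a Chat Completions request body (field names follow the OpenAI API; the `chat_request` helper is illustrative, not part of SwarmX):

```python
import json


def chat_request(prompt: str, model: str = "gpt-4o", stream: bool = False) -> dict:
    """Build an OpenAI-style Chat Completions request body."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": stream,
    }


# With the openai client shown above, the same request streams via:
# for chunk in client.chat.completions.create(**chat_request("Hello!", stream=True)):
#     print(chunk.choices[0].delta.content or "", end="")
body = chat_request("Hello!", stream=True)
print(json.dumps(body))
```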

Installation

Requires Python 3.11+

$ pip install swarmx # or `uv tool install swarmx`

Usage

import asyncio
from swarmx import Swarm, Agent

client = Swarm()

def transfer_to_agent_b():
    """Hand the conversation off to Agent B."""
    return agent_b


agent_a = Agent(
    name="Agent A",
    instructions="You are a helpful agent.",
    functions=[transfer_to_agent_b],
)

agent_b = Agent(
    name="Agent B",
    model="deepseek-r1:7b",
    instructions="你只能说中文。",  # You can only speak Chinese.
)


async def main():
    response = await client.run(
        agent=agent_a,
        messages=[{"role": "user", "content": "I want to talk to agent B."}],
    )

    print(response.messages[-1]["content"])


asyncio.run(main())

Context Variables

SwarmX supports special context variables that control agent behavior:

background

Provides additional context for the agent. This can be used for:

  • Adding external knowledge (web search results, database queries)
  • Compressing previous conversation history into summaries
  • Isolating context for sub-agents that don't need the full conversation history

Example:

context = {
    "background": "Recent news: AI conference announced for next month. User is interested in AI developments."
}

message_slice

Controls which messages are sent to the LLM using Python slice syntax. This enables:

  • Context compression by sending only recent messages
  • LLM-driven filtering decisions
  • Memory management for long conversations

Slice patterns:

  • ":10" - First 10 messages
  • "-5:" - Last 5 messages
  • ":0" - No messages (useful with background for context compression)
  • "2:8" - Messages from index 2 to 7

Example:

context = {
    "message_slice": "-10:"  # Send only last 10 messages
}
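These strings follow Python's own slice syntax. As a minimal sketch of how such a string maps onto a `slice` object (the `parse_slice` helper below is illustrative, not part of the SwarmX API):

```python
def parse_slice(spec: str) -> slice:
    """Parse a string like "-10:" or "2:8" into a Python slice object."""
    # Empty segments become None, matching Python's a[:10] / a[-5:] semantics
    parts = [int(p) if p else None for p in spec.split(":")]
    return slice(*parts)


messages = [f"msg{i}" for i in range(12)]
print(messages[parse_slice("-5:")])  # last 5 messages
print(messages[parse_slice(":0")])   # no messages
```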

tools

Dynamically selects which tools are available for the current completion. This allows:

  • Context-aware tool selection
  • Reducing tool overload by showing only relevant tools
  • Dynamic tool routing based on conversation context

Example:

context = {
    "tools": [
        {
            "type": "function",
            "function": {
                "name": "search_web",
                "description": "Search the web for information",
                "parameters": {
                    "type": "object",
                    "properties": {
                        "query": {"type": "string"}
                    },
                    "required": ["query"]
                }
            }
        },
        {
            "type": "function", 
            "function": {
                "name": "get_weather",
                "description": "Get weather information for a location",
                "parameters": {
                    "type": "object",
                    "properties": {
                        "location": {"type": "string"}
                    },
                    "required": ["location"]
                }
            }
        }
    ]
}
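The tool dicts above follow the OpenAI function-calling schema. Such a schema can also be derived from a Python function's signature; the `function_tool` helper below is a sketch (not SwarmX's own converter) that maps str/int/float/bool annotations to JSON Schema types and treats parameters without defaults as required:

```python
import inspect


def function_tool(fn) -> dict:
    """Build an OpenAI-style function tool schema from a Python function."""
    type_map = {str: "string", int: "integer", float: "number", bool: "boolean"}
    properties, required = {}, []
    for name, param in inspect.signature(fn).parameters.items():
        properties[name] = {"type": type_map.get(param.annotation, "string")}
        if param.default is inspect.Parameter.empty:
            required.append(name)
    return {
        "type": "function",
        "function": {
            "name": fn.__name__,
            "description": (fn.__doc__ or "").strip(),
            "parameters": {
                "type": "object",
                "properties": properties,
                "required": required,
            },
        },
    }


def get_weather(location: str):
    """Get weather information for a location"""


context = {"tools": [function_tool(get_weather)]}
```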

Advanced Usage Examples

Context Compression:

# Compress history into background and send no previous messages
context = {
    "background": "Previous conversation summary: User asked about weather, then travel plans.",
    "message_slice": ":0"  # No previous messages
}

RAG Pattern:

# Add web search results to background, send all messages for comprehensive context
context = {
    "background": "Web search results: Latest AI developments from yesterday's conference...",
    # Omitting message_slice sends all messages to the LLM
}

Dynamic Tool Selection:

# Based on the conversation topic, expose only the relevant tool
if "weather" in user_message:
    context = {"tools": [{"type": "function", "function": {
        "name": "get_weather", "description": "Get weather",
        "parameters": {"type": "object",
                       "properties": {"location": {"type": "string"}},
                       "required": ["location"]}}}]}
elif "search" in user_message:
    context = {"tools": [{"type": "function", "function": {
        "name": "search_web", "description": "Search web",
        "parameters": {"type": "object",
                       "properties": {"query": {"type": "string"}},
                       "required": ["query"]}}}]}

Architecture

graph TD
   classDef QA fill:#ffffff;
   classDef agent fill:#ffd8ac;
   classDef tool fill:#d3ecee;
   classDef result fill:#b4f2be;
   func1("transfer_to_weather_assistant()"):::tool
   Weather["Weather Assistant"]:::agent
   func2("get_weather('New York')"):::tool
   temp(64):::result
   A["It's 64 degrees in New York."]:::QA
   Q["What's the weather in ny?"]:::QA --> Triage["Triage Agent"]:::agent --> Weather --> A
   Triage --> func1 --> Weather
   Weather --> func2 --> temp --> A

Project details


Download files

Download the file for your platform.

Source Distribution

swarmx-0.7.0a0.tar.gz (21.1 kB)

Uploaded Source

Built Distribution


swarmx-0.7.0a0-py3-none-any.whl (24.1 kB)

Uploaded Python 3

File details

Details for the file swarmx-0.7.0a0.tar.gz.

File metadata

  • Download URL: swarmx-0.7.0a0.tar.gz
  • Upload date:
  • Size: 21.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: uv/0.8.15

File hashes

Hashes for swarmx-0.7.0a0.tar.gz

  • SHA256: 182065c139e9523d8d8c421bc4c04ac2d524559e86071813f0626e1e6d16ae0b
  • MD5: 0819fbb086c2b41587dfb364be611f13
  • BLAKE2b-256: b712cba43b41daa9dcca4ef3a279e822c43555d7bc065b0a89d63efee3e92a3c


File details

Details for the file swarmx-0.7.0a0-py3-none-any.whl.

File metadata

  • Download URL: swarmx-0.7.0a0-py3-none-any.whl
  • Upload date:
  • Size: 24.1 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: uv/0.8.15

File hashes

Hashes for swarmx-0.7.0a0-py3-none-any.whl

  • SHA256: a899187e8a577a00e1d71ae0d0ce193805e67da91b4e8a903c1c2d4a8087bbcc
  • MD5: e4ad462febc3e09d7780891fe1138beb
  • BLAKE2b-256: 066986597b2157cb4b447b27cd437e208cbe3b97c3b1908a12f92b0fda01e47e

