
Headless framework for multi-agent chat runtime and evaluation.


SwarmForge

SwarmForge is a Python package for authoring, running, and evaluating multi-agent swarms. You define the swarm graph, provide the model-turn callback, and keep sessions, handoffs, tools, checkpoints, and evaluation artifacts under your control.

  • Explicit graph-based multi-agent runtime
  • Python tool execution with inferred JSON schema
  • OpenRouter, Gemini, and other OpenAI-compatible provider support
  • FastAPI transport for stateless and session-backed HTTP flows
  • Evaluation helpers for graph snapshots, scenario seeds, and artifact scoring
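
The "inferred JSON schema" bullet refers to deriving a tool's parameter schema from its Python signature. A minimal sketch of that idea, using only the standard library (`infer_json_schema` and `get_weather` are hypothetical helpers for illustration, not SwarmForge's actual code):

```python
import inspect

# Map a few common Python annotations to JSON Schema type names.
_TYPE_MAP = {str: "string", int: "integer", float: "number", bool: "boolean"}


def infer_json_schema(func):
    """Build a JSON Schema object for func's parameters from its type hints."""
    properties = {}
    required = []
    for name, param in inspect.signature(func).parameters.items():
        json_type = _TYPE_MAP.get(param.annotation, "string")
        properties[name] = {"type": json_type}
        # Parameters without a default are treated as required.
        if param.default is inspect.Parameter.empty:
            required.append(name)
    return {"type": "object", "properties": properties, "required": required}


def get_weather(city: str, days: int = 3) -> str:
    """Example tool function."""
    return f"{city}: sunny for {days} days"


schema = infer_json_schema(get_weather)
```

The resulting schema marks `city` as a required string and `days` as an optional integer, which is the shape OpenAI-compatible tool definitions expect.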

Install

Python 3.11+ is required.

pip install swarmforge

If you want the FastAPI transport too:

pip install "swarmforge[api]"

Provider-backed examples, the demo UI, and local API runs load a nearby .env automatically. Copy .env.example to .env, then set MODEL_PROVIDER, LLM_MODEL, and the matching API key before using any runnable example:

cp .env.example .env

Quick Start

The shortest path is a single-node swarm with a real provider-backed turn runner:

import asyncio
import json

from swarmforge.env import require_env_vars
from swarmforge.evaluation.provider import ModelConfig, OpenAIClientWrapper
from swarmforge.swarm import (
    AgentTurnConfig,
    AgentTurnResult,
    InMemorySessionStore,
    SwarmDefinition,
    SwarmNode,
    SwarmSession,
    process_swarm_stream,
)


class ProviderBackedTurnRunner:
    def __init__(self) -> None:
        self.client = OpenAIClientWrapper(ModelConfig())

    async def run_turn(self, *, agent_node, contents, config: AgentTurnConfig):
        # Single-node swarm: the node metadata is not needed by this runner.
        del agent_node
        messages = []
        if config.system_instruction:
            messages.append({"role": "system", "content": config.system_instruction})

        for item in contents:
            role = str(item.get("role") or "user").strip()
            if role == "model":
                # Gemini-style history uses "model"; OpenAI expects "assistant".
                role = "assistant"
            message = {"role": role, "content": item.get("content", "")}
            if role == "tool":
                if item.get("tool_call_id"):
                    message["tool_call_id"] = item.get("tool_call_id")
                if item.get("name"):
                    message["name"] = item.get("name")
            messages.append(message)

        response = self.client.chat_completion(messages=messages, tools=config.tools)
        assistant_message = response.choices[0].message
        tool_calls = []
        if assistant_message.tool_calls:
            for tool_call in assistant_message.tool_calls:
                # Arguments may arrive as a JSON string; normalize to a dict.
                args = tool_call.function.arguments
                if not isinstance(args, dict):
                    args = json.loads(args or "{}")
                tool_calls.append(
                    {
                        "id": tool_call.id,
                        "name": tool_call.function.name,
                        "args": args,
                    }
                )

        return AgentTurnResult(
            response_text=assistant_message.content or "",
            tool_calls=tool_calls,
            raw_response=response,
        )


swarm = SwarmDefinition(
    id="assistant",
    name="Assistant Swarm",
    nodes=[
        SwarmNode(
            id="assistant",
            node_key="assistant",
            name="Assistant",
            intent="Handle general requests",
            system_prompt="You are a concise assistant.",
            capabilities=["Answer questions"],
            is_entry_node=True,
        )
    ],
)


async def main():
    require_env_vars("MODEL_PROVIDER", "LLM_MODEL")
    session = SwarmSession(id="session-1", swarm=swarm)
    store = InMemorySessionStore()
    async for event in process_swarm_stream(
        session,
        "Give me a concise summary.",
        store=store,
        turn_runner=ProviderBackedTurnRunner(),
    ):
        print(json.dumps(event, indent=2))


if __name__ == "__main__":
    asyncio.run(main())

The final done event contains the real model output, so the wording varies by provider and model. When you are ready to add routing, continue with the multi-agent flow in the docs.
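
If you collect the stream instead of printing each event, the terminal event can be picked out by its type. The event keys below ("type", "text") are assumptions for illustration; check the actual payloads that process_swarm_stream yields:

```python
# Hypothetical event payloads standing in for what the stream yields.
events = [
    {"type": "agent_start", "node": "assistant"},
    {"type": "message", "text": "Here is a concise summary."},
    {"type": "done", "text": "Here is a concise summary."},
]

# Pick the terminal event; fall back to None if the run never finished.
done = next((e for e in events if e.get("type") == "done"), None)
final_text = done["text"] if done else None
```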

Package Surfaces

  • swarmforge.swarm: Runtime models, session state, orchestration, tool execution, and stores.
  • swarmforge.authoring: Prompt templates, payload validation, and graph compilation helpers.
  • swarmforge.evaluation: Graph snapshots, scenario generation, feasibility checks, and artifact scoring.
  • swarmforge.api: FastAPI application factory built on the same runtime primitives.

Providers

SwarmForge ships with an OpenAI-compatible provider wrapper. OpenRouter is the default provider; Gemini is built in as an alternative.

Start from the repository .env.example and explicitly set both the provider and the model you want to use.

OpenRouter .env:

MODEL_PROVIDER=openrouter
LLM_MODEL=openrouter/auto
OPENROUTER_API_KEY=sk-or-...
OPENROUTER_SITE_URL=https://your-app.example
OPENROUTER_APP_NAME="Your App Name"

Gemini .env:

MODEL_PROVIDER=gemini
LLM_MODEL=gemini-3-flash-preview
GEMINI_API_KEY=...

Minimal client setup:

from swarmforge.evaluation.provider import ModelConfig, OpenAIClientWrapper

client = OpenAIClientWrapper(ModelConfig())

ModelConfig() reads MODEL_PROVIDER, LLM_MODEL, and the matching API key from .env or the shell environment.
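
Because configuration comes from the environment, you can also set these values programmatically before constructing the client. A standard-library sketch mirroring the OpenRouter .env above (the API key here is a placeholder; a real one would come from a secret store):

```python
import os

# Same variable names as the OpenRouter .env example.
os.environ["MODEL_PROVIDER"] = "openrouter"
os.environ["LLM_MODEL"] = "openrouter/auto"
os.environ["OPENROUTER_API_KEY"] = "sk-or-placeholder"

# ModelConfig() would pick these up exactly as it does from a .env file.
provider = os.environ["MODEL_PROVIDER"]
model = os.environ["LLM_MODEL"]
```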

FastAPI Transport

You can expose the runtime over HTTP without changing your swarm definitions:

pip install "swarmforge[api]"
uvicorn swarmforge.api.fastapi:create_fastapi_app --factory --reload

That app exposes both stateless run endpoints and session-backed endpoints with SSE streaming.
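
Server-sent events arrive as "data:"-prefixed lines separated by blank lines. A minimal client-side sketch of decoding that wire format (the JSON payload shapes here are assumptions, not SwarmForge's actual event schema):

```python
import json

# Two SSE frames as they would appear on the wire; payloads are illustrative.
raw = (
    'data: {"type": "message", "text": "partial"}\n'
    "\n"
    'data: {"type": "done", "text": "final answer"}\n'
    "\n"
)

# Each line carrying a "data: " prefix holds one JSON-encoded event.
events = [
    json.loads(line[len("data: "):])
    for line in raw.splitlines()
    if line.startswith("data: ")
]
```

A production client would use a streaming HTTP library and buffer partial frames, but the framing rule is the same.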

Documentation

Source Examples

The repository includes end-to-end example scripts under examples/. They are useful when you want runnable reference flows for authoring, orchestration, evaluation, provider integration, or FastAPI transport. Provider-backed examples and the local FastAPI example read their settings from a .env based on .env.example.

The demo UI under demo-ui/ reads the same root .env for its default API base, provider, and model. Its Vite scripts create .env from .env.example automatically when the file is missing.

Contributing

Core modification, docs development, demo UI work, and PyPI release steps are documented in CONTRIBUTING.md.

Download files

Download the file for your platform.

Source Distribution

swarmforge-0.7.0.tar.gz (63.1 kB)


Built Distribution


swarmforge-0.7.0-py3-none-any.whl (57.0 kB)


File details

Details for the file swarmforge-0.7.0.tar.gz.

File metadata

  • Download URL: swarmforge-0.7.0.tar.gz
  • Size: 63.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for swarmforge-0.7.0.tar.gz:

  • SHA256: a29c8da19d6c579fa1078712dc786acc6d9d61c328e1f9f5492cab08ddc61306
  • MD5: 16d026b3b62ae52b4faf10ebd33f37a8
  • BLAKE2b-256: e839af73665f201458de0ec59584f4ec69e42189a1dd97c61738059814ae0db8


Provenance

The following attestation bundles were made for swarmforge-0.7.0.tar.gz:

Publisher: release.yml on Rvey/swarm-forge

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file swarmforge-0.7.0-py3-none-any.whl.

File metadata

  • Download URL: swarmforge-0.7.0-py3-none-any.whl
  • Size: 57.0 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for swarmforge-0.7.0-py3-none-any.whl:

  • SHA256: 1a48624ce5aa60ce6bb5086b163c2ef9e1f076447a0639db6a076541d2379f51
  • MD5: 88b7dcfb6adde3733b97f159815fc830
  • BLAKE2b-256: d559dc382cc4a581b1d47abdae43ffac1c145f4762b071dc4a803305926ff71f


Provenance

The following attestation bundles were made for swarmforge-0.7.0-py3-none-any.whl:

Publisher: release.yml on Rvey/swarm-forge

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
