
Headless framework for multi-agent chat runtime and evaluation.

Project description

SwarmForge

SwarmForge is a Python package for authoring, running, and evaluating multi-agent swarms. You define the swarm graph, provide the model-turn callback, and keep sessions, handoffs, tools, checkpoints, and evaluation artifacts under your control.

  • Explicit graph-based multi-agent runtime
  • Python tool execution with inferred JSON schema
  • OpenRouter, Gemini, and other OpenAI-compatible provider support
  • FastAPI transport for stateless and session-backed HTTP flows
  • Evaluation helpers for graph snapshots, scenario seeds, and artifact scoring
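The "inferred JSON schema" feature means tool parameter schemas are derived from Python function signatures rather than written by hand. SwarmForge's own inference rules are not shown on this page, so the sketch below only illustrates the general technique with the standard library; none of these names are SwarmForge APIs:

```python
import inspect
import typing

# Illustrative only: derive a JSON schema from a Python signature,
# in the spirit of SwarmForge's inferred tool schemas.
_JSON_TYPES = {str: "string", int: "integer", float: "number", bool: "boolean"}

def infer_schema(func) -> dict:
    hints = typing.get_type_hints(func)
    props, required = {}, []
    for name, param in inspect.signature(func).parameters.items():
        props[name] = {"type": _JSON_TYPES.get(hints.get(name), "string")}
        if param.default is inspect.Parameter.empty:
            required.append(name)  # no default -> required parameter
    return {"type": "object", "properties": props, "required": required}

def get_weather(city: str, units: str = "metric") -> str:
    """Look up the weather for a city."""
    return f"Weather for {city} in {units}"

schema = infer_schema(get_weather)
```

Here `city` ends up required and `units` optional, because only `units` has a default.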

Install

Python 3.11+ is required.

pip install swarmforge

If you want the FastAPI transport too:

pip install "swarmforge[api]"

Provider-backed examples, the demo UI, and local API runs load a nearby .env automatically. Copy .env.example to .env, then set MODEL_PROVIDER, LLM_MODEL, and the matching API key before using any runnable example:

cp .env.example .env

Quick Start

The shortest path is a single-node swarm with a real provider-backed turn runner:

import asyncio
import json

from swarmforge.env import require_env_vars
from swarmforge.evaluation.provider import ModelConfig, OpenAIClientWrapper
from swarmforge.swarm import (
    AgentTurnConfig,
    AgentTurnResult,
    InMemorySessionStore,
    SwarmDefinition,
    SwarmNode,
    SwarmSession,
    process_swarm_stream,
)


class ProviderBackedTurnRunner:
    def __init__(self) -> None:
        self.client = OpenAIClientWrapper(ModelConfig())

    async def run_turn(self, *, agent_node, contents, config: AgentTurnConfig):
        del agent_node  # unused in a single-node swarm
        messages = []
        if config.system_instruction:
            messages.append({"role": "system", "content": config.system_instruction})

        # Map the runtime history into OpenAI-style chat messages.
        for item in contents:
            role = str(item.get("role") or "user").strip()
            if role == "model":
                role = "assistant"  # Gemini-style role name -> OpenAI role name
            message = {"role": role, "content": item.get("content", "")}
            if role == "tool":
                # Tool results must carry the originating call id and tool name.
                if item.get("tool_call_id"):
                    message["tool_call_id"] = item.get("tool_call_id")
                if item.get("name"):
                    message["name"] = item.get("name")
            messages.append(message)

        response = self.client.chat_completion(messages=messages, tools=config.tools)
        assistant_message = response.choices[0].message
        tool_calls = []
        if assistant_message.tool_calls:
            for tool_call in assistant_message.tool_calls:
                # Arguments may arrive as a dict or a JSON-encoded string,
                # depending on the provider.
                args = tool_call.function.arguments
                if not isinstance(args, dict):
                    args = json.loads(args or "{}")
                tool_calls.append(
                    {
                        "id": tool_call.id,
                        "name": tool_call.function.name,
                        "args": args,
                    }
                )

        return AgentTurnResult(
            response_text=assistant_message.content or "",
            tool_calls=tool_calls,
            raw_response=response,
        )


swarm = SwarmDefinition(
    id="assistant",
    name="Assistant Swarm",
    nodes=[
        SwarmNode(
            id="assistant",
            node_key="assistant",
            name="Assistant",
            intent="Handle general requests",
            system_prompt="You are a concise assistant.",
            capabilities=["Answer questions"],
            is_entry_node=True,
        )
    ],
)


async def main():
    require_env_vars("MODEL_PROVIDER", "LLM_MODEL")
    session = SwarmSession(id="session-1", swarm=swarm)
    store = InMemorySessionStore()
    async for event in process_swarm_stream(
        session,
        "Give me a concise summary.",
        store=store,
        turn_runner=ProviderBackedTurnRunner(),
    ):
        print(json.dumps(event, indent=2))


if __name__ == "__main__":
    asyncio.run(main())
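Note the defensive handling of tool-call arguments in the runner above: depending on the provider, `function.arguments` may arrive as a dict or as a JSON-encoded string, possibly empty. That normalization is worth keeping in any custom turn runner; a standalone sketch:

```python
import json

def normalize_tool_args(args) -> dict:
    # Providers differ: arguments may already be a dict, a JSON string,
    # or an empty/None value when the model passed no arguments.
    if isinstance(args, dict):
        return args
    return json.loads(args or "{}")
```

With this helper, `{"q": "x"}`, `'{"q": "x"}'`, `""`, and `None` all normalize to a plain dict.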

The final done event contains the real model output, so the wording varies by provider and model. When you are ready to add routing, continue with the multi-agent flow in the docs.
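SwarmForge's actual handoff and routing API lives in the docs, but as a rough mental model of the graph-based runtime: each node declares an intent, and handoffs are edges the runtime is allowed to follow. A toy illustration of that model in plain Python (these are not SwarmForge APIs):

```python
# Toy mental model of graph routing (not SwarmForge APIs): nodes declare
# intents, edges name the allowed handoffs, and a router picks the next node.
GRAPH = {
    "triage": {"intent": "Classify the request", "handoffs": ["billing", "support"]},
    "billing": {"intent": "Handle billing questions", "handoffs": []},
    "support": {"intent": "Handle technical issues", "handoffs": []},
}

def route(current: str, target: str) -> str:
    # A handoff is only valid if the graph declares the edge.
    if target in GRAPH[current]["handoffs"]:
        return target
    return current  # stay put on an invalid handoff
```

The point of the explicit graph is exactly this validity check: an agent cannot hand off to a node the author never connected.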

Package Surfaces

  • swarmforge.swarm: Runtime models, session state, orchestration, tool execution, and stores.
  • swarmforge.authoring: Prompt templates, payload validation, and graph compilation helpers.
  • swarmforge.evaluation: Graph snapshots, scenario generation, feasibility checks, and artifact scoring.
  • swarmforge.api: FastAPI application factory built on the same runtime primitives.

Providers

SwarmForge ships with an OpenAI-compatible provider wrapper. OpenRouter is the default, and Gemini is supported as a built-in alternative.

Start from the repository .env.example and explicitly set both the provider and the model you want to use.

OpenRouter .env:

MODEL_PROVIDER=openrouter
LLM_MODEL=openrouter/auto
OPENROUTER_API_KEY=sk-or-...
OPENROUTER_SITE_URL=https://your-app.example
OPENROUTER_APP_NAME="Your App Name"

Gemini .env:

MODEL_PROVIDER=gemini
LLM_MODEL=gemini-3-flash-preview
GEMINI_API_KEY=...

Minimal client setup:

from swarmforge.evaluation.provider import ModelConfig, OpenAIClientWrapper

client = OpenAIClientWrapper(ModelConfig())

ModelConfig() reads MODEL_PROVIDER, LLM_MODEL, and the matching API key from .env or the shell environment.
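Because configuration comes from the environment, it pays to fail fast when something is missing, which is what the Quick Start's require_env_vars call does. An illustrative stand-in for that check (the variable names come from this README; the helper itself is a sketch, not the swarmforge.env implementation):

```python
import os

def require_env(*names: str) -> None:
    # Illustrative equivalent of swarmforge.env.require_env_vars:
    # fail fast with a clear message when configuration is missing.
    missing = [n for n in names if not os.environ.get(n)]
    if missing:
        raise RuntimeError(f"Missing required env vars: {', '.join(missing)}")

os.environ["MODEL_PROVIDER"] = "openrouter"
os.environ["LLM_MODEL"] = "openrouter/auto"
require_env("MODEL_PROVIDER", "LLM_MODEL")  # passes once both are set
```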

FastAPI Transport

You can expose the runtime over HTTP without changing your swarm definitions:

pip install "swarmforge[api]"
uvicorn swarmforge.api.fastapi:create_fastapi_app --factory --reload

That app exposes both stateless run endpoints and session-backed endpoints with SSE streaming.
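The endpoint paths and event payloads are defined by the app, but the SSE framing itself is standard: each event is one or more data: lines, and events are separated by a blank line. A minimal client-side parser sketch, assuming JSON payloads and no SwarmForge specifics:

```python
import json

def parse_sse(stream: str) -> list:
    # Standard SSE framing: "data:" lines, blank line between events.
    events = []
    for block in stream.split("\n\n"):
        data_lines = [line[5:].lstrip() for line in block.splitlines()
                      if line.startswith("data:")]
        if data_lines:
            events.append(json.loads("\n".join(data_lines)))
    return events

raw = 'data: {"type": "token", "text": "Hi"}\n\ndata: {"type": "done"}\n\n'
events = parse_sse(raw)
```

A real client would read the response body incrementally instead of splitting a complete string, but the framing rules are the same.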

Documentation

Source Examples

The repository includes end-to-end example scripts under examples/. Those scripts are useful when you want runnable reference flows for authoring, orchestration, evaluation, provider integration, or FastAPI transport. Provider-backed examples and the local FastAPI example read the same settings described in .env.example.

The demo UI under demo-ui/ reads the same root .env for its default API base, provider, and model. Its Vite scripts create .env from .env.example automatically when the file is missing.

Contributing

Core modification, docs development, demo UI work, and PyPI release steps are documented in CONTRIBUTING.md.



Download files

Download the file for your platform.

Source Distribution

swarmforge-0.8.0.tar.gz (66.7 kB)

Built Distribution

swarmforge-0.8.0-py3-none-any.whl (59.9 kB)

File details

Details for the file swarmforge-0.8.0.tar.gz.

File metadata

  • File name: swarmforge-0.8.0.tar.gz
  • Size: 66.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for swarmforge-0.8.0.tar.gz:

  • SHA256: de546ee1e4f94954e336fff1cdf04b752e0a89c40091ce9289e547b650f6c051
  • MD5: 771e7e2ce3edc7fe03f4f447634d552b
  • BLAKE2b-256: 66b83c5e7c5960ea0b78921bcea44243c5f6f95375cb703f3a25fe4efb5daee7

Provenance

The following attestation bundles were made for swarmforge-0.8.0.tar.gz:

Publisher: release.yml on Rvey/swarm-forge

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file swarmforge-0.8.0-py3-none-any.whl.

File metadata

  • File name: swarmforge-0.8.0-py3-none-any.whl
  • Size: 59.9 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for swarmforge-0.8.0-py3-none-any.whl:

  • SHA256: c1b334db7a624e950eea4df5cc9477ba3d4a728d0eb51c7326ffb206e5f428fb
  • MD5: ceef9a2f87812f1b2c6dd8ba3d829d40
  • BLAKE2b-256: 5ca17fea0e1d00ed202fa64e637231d44b7a043df1672132fb1ed36b89527faf

Provenance

The following attestation bundles were made for swarmforge-0.8.0-py3-none-any.whl:

Publisher: release.yml on Rvey/swarm-forge

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
