
Emergent — Multi-Model AI Hub


Chat with GPT, Claude, Gemini, Grok, DeepSeek, and Perplexity in parallel. Compare, synthesize, and chain their responses — all in one interface.


Try it live

emergentapp.interdependentway.org

| Tier | Price | What you get |
|---|---|---|
| Free | $0 | Explore the interface |
| Core | $15/month | Full EDCM analytics, API key management, cost tracking |
| Founder | $153 one-time | Locked $15 rate forever, founder badge, early access — 53 slots only |
| Compute credits | $10–$50 | Pay-per-use compute blocks |

What is this?

Emergent is a multi-model AI hub that lets you send one prompt to multiple LLMs simultaneously and work with their responses together. Instead of copy-pasting between chat tabs, you get a single interface with color-coded, side-by-side or stacked responses from every model you care about.

Beyond simple fan-out, Emergent supports structured interaction patterns — synthesis (feed multiple responses into one model for analysis), shared rooms (models that see and respond to each other), daisy chains (A→B→C sequential pipelines), council mode, and roleplay scenarios. The EDCM engine analyzes conversation transcripts across six cognitive metrics and surfaces actionable insights.


Interaction Patterns

| Pattern | What it does |
|---|---|
| Fan-out | Send one prompt to N models in parallel |
| Synthesis | Select responses, send to a synthesis model for analysis |
| Shared Room (All) | All models see each other's responses and reply in rounds |
| Shared Room (Synthesized) | Responses synthesized first, then drive the next round |
| Daisy Chain | Model A → B → C sequentially, each seeing the previous response |
| Council | Each model synthesizes all responses, including its own |
| Roleplay | DM-driven roleplay with initiative ordering and reactions |
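
To make the daisy-chain pattern concrete, here is a minimal sketch written directly against asyncio. This is an illustration only, not aimmh-lib's actual implementation: `daisy_chain_sketch` and `echo_model` are hypothetical names, and the `call_fn(model_id, messages)` interface mirrors the library example later on this page.

```python
import asyncio
from typing import Awaitable, Callable

async def daisy_chain_sketch(
    call_fn: Callable[[str, list[dict]], Awaitable[str]],
    model_ids: list[str],
    messages: list[dict],
) -> list[tuple[str, str]]:
    """Run models sequentially; each model sees all prior responses."""
    history = list(messages)
    results: list[tuple[str, str]] = []
    for model_id in model_ids:
        content = await call_fn(model_id, history)
        results.append((model_id, content))
        # Feed this response forward as context for the next model.
        history.append({"role": "assistant", "content": content})
    return results

async def echo_model(model_id: str, messages: list[dict]) -> str:
    # Stub backend: report how much context this model received.
    return f"{model_id} saw {len(messages)} message(s)"

chain = asyncio.run(daisy_chain_sketch(
    echo_model,
    ["model-a", "model-b", "model-c"],
    [{"role": "user", "content": "Refine this idea."}],
))
for model_id, content in chain:
    print(f"{model_id}: {content}")
```

The growing `history` list is the whole trick: each model in the chain receives one more message than the last.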

Self-hosting

Backend (FastAPI + MongoDB)

```shell
cd backend
pip install -r requirements.txt

# Required env vars
export MONGO_URI="mongodb://localhost:27017"
export JWT_SECRET="your-secret"

# Optional env vars (payments)
export STRIPE_SECRET_KEY="sk_..."
export STRIPE_WEBHOOK_SECRET="whsec_..."

uvicorn server:app --reload
```

Frontend (React)

```shell
cd frontend
npm install
npm start
```

The frontend expects the backend at http://localhost:8000 by default.


aimmh-lib — the open-source core

The orchestration patterns are extracted into a standalone, zero-dependency Python library.

```shell
pip install aimmh-lib
```

```python
import asyncio
from aimmh_lib import fan_out

async def call_model(model_id: str, messages: list[dict]) -> str:
    # plug in any model backend here
    return f"Response from {model_id}"

async def main():
    results = await fan_out(
        call_fn=call_model,
        model_ids=["gpt-4o", "claude-3-5-sonnet", "gemini-1.5-pro"],
        messages=[{"role": "user", "content": "What is the best programming language?"}],
    )
    for r in results:
        print(f"{r.model_id}: {r.content}")

asyncio.run(main())
```

All six patterns are available: `fan_out`, `daisy_chain`, `room_all`, `room_synthesized`, `council`, `roleplay`.
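
The shared-room pattern can be sketched in the same style. Again, this is an illustrative sketch rather than the library's actual API: `room_all_sketch` and `stub_model` are names invented here, and the only assumption carried over from the example above is the `call_fn(model_id, messages)` interface.

```python
import asyncio
from typing import Awaitable, Callable

async def room_all_sketch(
    call_fn: Callable[[str, list[dict]], Awaitable[str]],
    model_ids: list[str],
    prompt: str,
    rounds: int = 2,
) -> list[dict]:
    """Round-based shared room: every model replies to a shared transcript."""
    transcript = [{"role": "user", "content": prompt}]
    for _ in range(rounds):
        # All models answer in parallel, each seeing the same transcript.
        replies = await asyncio.gather(
            *(call_fn(m, list(transcript)) for m in model_ids)
        )
        # Append every reply so the next round sees everyone's responses.
        for model_id, content in zip(model_ids, replies):
            transcript.append(
                {"role": "assistant", "name": model_id, "content": content}
            )
    return transcript

async def stub_model(model_id: str, messages: list[dict]) -> str:
    # Stub backend: report how much context this model received.
    return f"{model_id} replying to {len(messages)} message(s)"

transcript = asyncio.run(room_all_sketch(stub_model, ["a", "b"], "Debate.", rounds=2))
```

Within a round the calls run concurrently via `asyncio.gather`; only between rounds does the transcript grow, which is what lets models "see" each other.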

PyPI →


Tech Stack

Backend: FastAPI · Motor (async MongoDB) · asyncio · Stripe · Google OAuth · JWT

Frontend: React · Tailwind CSS · Shadcn UI · React Router

Library: Pure Python 3.11+ · zero runtime dependencies


Repository Structure

```
aimmh_lib/   # pip install aimmh-lib — zero-dep async orchestration library
backend/     # FastAPI service (auth, multi-model chat, payments, EDCM)
frontend/    # React UI
```

License

aimmh_lib/ is MIT licensed. The backend and frontend are proprietary — you may self-host for personal use but may not offer them as a competing hosted service.
