
Distributed AI framework - grow your own cluster from whatever hardware you've got


๐Ÿ„ mycoSwarm

Distributed AI for everyone. Turn forgotten hardware into a thinking network.

mycoSwarm connects your machines - old laptops, mini PCs, Raspberry Pis, GPU workstations - into a single AI swarm. No cloud. No API keys. No data leaves your network.

mycoSwarm Demo

curl -fsSL https://raw.githubusercontent.com/msb-msb/mycoSwarm/main/scripts/install.sh | bash
mycoswarm chat

That's it. Two commands. You're running local AI.


Dashboard

mycoSwarm Dashboard

Live swarm monitoring - 5 nodes, 86.6 GB RAM, all from rescued hardware under $1,100.


What It Does

One machine? Chat with local models instantly - no daemon, no config.

Multiple machines? They find each other automatically via mDNS, share capabilities, and route tasks to the right hardware. A $50 mini PC can chat with a 27B model running on a GPU across the room.

The weakest machine in the swarm gets access to the strongest model.

Real Example: 5-Node Swarm

Node         Hardware                 Cost          Role
Miu          RTX 3090, 64GB RAM       ~$850 (used)  GPU inference - runs 27B models
naru         Lenovo M710Q, 8GB RAM    $50           Web search, file processing
uncho        Lenovo M710Q, 8GB RAM    $50           Web search, coordination
boa          Lenovo M710Q, 8GB RAM    $50           Web search, code execution
raspberrypi  Raspberry Pi 2, 1GB RAM  $35           Search, lightweight tasks

Total: ~$1,035. Zero monthly fees.


Features

Chat with memory - Persistent facts and session history across conversations. Your AI remembers what you tell it.

Research - Ask a question and the swarm plans multiple searches, distributes them across CPU workers in parallel, and synthesizes a cited answer on the GPU. Faster than any single machine.

Document library (RAG) - Drop files into ~/mycoswarm-docs/. The swarm indexes them and answers questions about your documents with citations.

Agentic tool routing - The model automatically decides when it needs web search or document lookup, shows you what it's doing, and uses the results. No manual tool selection.

Honest AI - When it doesn't know something, it says so. No hallucinated weather forecasts or fabricated facts.

Identity - Persistent self-model with first-run naming. Your AI remembers its own name across sessions.

Self-awareness (8 C's) - Real-time vital signs after every response: Calm, Clarity, Curiosity, Compassion, Courage, Creativity, Connectedness, Confidence. Derived from pipeline signals, not simulated.

Wu Wei Timing Gate - Contextual response calibration. Late night → shorter, warmer. Exploration mode → deeper, expansive. No LLM call, pure heuristics.

Procedural memory - The swarm learns from experience. Wisdom procedures surface automatically when similar problems recur.

Intent classification - Pre-inference routing decides tool, mode, and scope before the model runs.

Plugin system - Drop a folder into ~/.config/mycoswarm/plugins/ and your node advertises a new capability. No core code changes.
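The Wu Wei Timing Gate above is described as pure heuristics with PROCEED/GENTLE/DEEP states (per timing.py in the architecture section). A minimal sketch of how such contextual calibration might look - the thresholds and reason strings are illustrative assumptions, not mycoSwarm's actual logic:

```python
from dataclasses import dataclass

@dataclass
class GateDecision:
    state: str   # one of "PROCEED", "GENTLE", "DEEP"
    reason: str

def timing_gate(hour: int, exploring: bool) -> GateDecision:
    """Pick a response style from context alone - no LLM call.

    `hour` is the local hour (0-23); `exploring` flags an open-ended
    session. Cutoffs here are illustrative guesses.
    """
    if exploring:
        return GateDecision("DEEP", "exploration mode: deeper, expansive replies")
    if hour >= 22 or hour < 6:
        return GateDecision("GENTLE", "late night: shorter, warmer replies")
    return GateDecision("PROCEED", "default calibration")
```

Because it is a plain function of observable context, it costs nothing per response and is trivially testable.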


Install

Quick Start (Linux or macOS)

curl -fsSL https://raw.githubusercontent.com/msb-msb/mycoSwarm/main/scripts/install.sh | bash
mycoswarm chat

The installer detects your OS, installs Python and Ollama if needed, pulls a model sized for your RAM, and runs hardware detection.
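The "model sized for your RAM" step can be pictured as a simple lookup. This is a hedged sketch: the cutoffs and the gemma3:12b / gemma3:1b tags are my assumptions (only gemma3:27b and gemma3:4b appear elsewhere in this README); the real logic lives in scripts/install.sh.

```python
def model_for_ram(ram_gb: float) -> str:
    """Map available RAM to an Ollama model tag. Cutoffs are illustrative."""
    if ram_gb >= 48:
        return "gemma3:27b"   # large models, e.g. the 27B runs mentioned above
    if ram_gb >= 16:
        return "gemma3:12b"   # assumed mid tier
    if ram_gb >= 8:
        return "gemma3:4b"    # the README suggests gemma3:4b for 8GB Macs
    return "gemma3:1b"        # assumed fallback for tiny machines
```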

Manual Install

pip install mycoswarm
mycoswarm chat

Requires Ollama running with at least one model pulled.

macOS (Apple Silicon)

brew install ollama
ollama serve &
ollama pull gemma3:27b  # or gemma3:4b for 8GB Macs
pip install mycoswarm
mycoswarm chat

Apple Silicon unified memory is detected automatically - an M1 with 16GB can run 14B+ models.

Raspberry Pi

Works on Pi 2 and newer. pymupdf (PDF support) is optional - if it fails to build on ARM, PDF reading is disabled but everything else works.

sudo apt install -y python3-venv git
git clone https://github.com/msb-msb/mycoSwarm.git
cd mycoSwarm
python3 -m venv .venv
source .venv/bin/activate
pip install -e .
mycoswarm detect

Pi nodes can't run inference (no GPU, limited RAM) but contribute as web search workers, file processors, and coordinators.


Growing the Swarm

Single-node mode works out of the box. When you're ready for more:

Start the Daemon

mycoswarm daemon

Or install as a service (Linux):

sudo cp scripts/mycoswarm.service /etc/systemd/system/
sudo systemctl daemon-reload
sudo systemctl enable --now mycoswarm

Add Another Machine

Install mycoSwarm on the second machine and start the daemon. That's it. mDNS handles discovery - no IP addresses to configure, no config files to edit. Within seconds:

mycoswarm swarm

Shows both nodes, their capabilities, and available models.

How Routing Works

The orchestrator scores each node for each task type:

  • Inference → GPU nodes (highest VRAM wins)
  • Web search / file processing → CPU workers (distributed round-robin)
  • Embeddings → nodes running Ollama with embedding models
  • Code execution → CPU workers (sandboxed subprocess)

Tasks go to the best available node. If that node fails, the orchestrator retries on the next candidate. Executive (GPU) nodes are reserved for inference - they won't waste cycles on web searches when CPU workers are available.
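The score-dispatch-retry loop can be sketched as follows. The Node fields, score weights, and task-type names are illustrative stand-ins, not the actual orchestrator.py API; note how GPU nodes deliberately score zero for web search, matching the reservation rule above.

```python
from dataclasses import dataclass

@dataclass
class Node:
    name: str
    has_gpu: bool = False
    vram_gb: float = 0.0
    healthy: bool = True

def score(node: Node, task_type: str) -> float:
    """Illustrative scoring: GPU nodes win inference by VRAM;
    CPU workers take search so GPU nodes stay free for inference."""
    if not node.healthy:
        return -1.0
    if task_type == "inference":
        return node.vram_gb if node.has_gpu else 0.0
    if task_type == "web_search":
        return 0.0 if node.has_gpu else 1.0   # reserve GPU nodes
    return 0.5

def dispatch(nodes, task_type, execute):
    """Try nodes best-first; fall through to the next candidate on failure.
    `execute` stands in for the real task transport."""
    for node in sorted(nodes, key=lambda n: score(n, task_type), reverse=True):
        if score(node, task_type) < 0:
            continue   # skip unhealthy nodes entirely
        try:
            return execute(node, task_type)
        except ConnectionError:
            continue   # retry on the next-best candidate
    raise RuntimeError(f"no node could handle {task_type!r}")
```

With a 3090 node and an M710Q in the list, inference lands on the GPU box while web search lands on the mini PC, mirroring the table above.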


CLI Commands

Command                          What It Does
mycoswarm chat                   Interactive chat with memory, tools, and document search
mycoswarm ask "prompt"           Single question, streamed response
mycoswarm research "topic"       Parallel web search → synthesized answer with citations
mycoswarm rag "question"         Answer from your indexed documents
mycoswarm search "query"         Raw web search results
mycoswarm library ingest [path]  Index files for document search
mycoswarm library list           Show indexed documents
mycoswarm detect                 Show hardware and capabilities
mycoswarm swarm                  Swarm overview - all nodes and status
mycoswarm models                 All models across the swarm
mycoswarm plugins                Installed plugins
mycoswarm memory                 View and manage stored facts
mycoswarm daemon                 Start the swarm daemon

Chat Slash Commands

Command           What It Does
/remember <fact>  Store a persistent fact
/memories         Show all stored facts
/forget <n>       Remove a fact by number
/rag <question>   Search documents and answer
/library          Show indexed documents
/auto             Toggle agentic tool routing on/off
/identity         View name, origin, substrate
/name <n>         Set or change the AI's name
/vitals           Detailed 8 C's breakdown with bar charts
/timing           Current timing gate state and reasons
/stale            Show facts approaching the decay threshold
/procedure        View stored wisdom procedures
/model            Switch model
/clear            Reset conversation
/quit             Save session and exit

Architecture

src/mycoswarm/
├── hardware.py      # GPU/CPU/RAM/disk/Ollama detection (Linux, macOS, ARM)
├── capabilities.py  # Node classification - tiers, capabilities, model limits
├── node.py          # Persistent node identity (UUID survives restarts)
├── discovery.py     # mDNS auto-discovery, peer health tracking
├── api.py           # FastAPI service - health, status, peers, tasks, SSE streaming
├── daemon.py        # Main daemon - detection + discovery + API + worker + orchestrator
├── worker.py        # Task handlers - inference, search, embedding, files, code, translate
├── orchestrator.py  # Task routing - scoring, retry, load balancing, inflight tracking
├── plugins.py       # Plugin loader - scans ~/.config/mycoswarm/plugins/
├── solo.py          # Single-node mode - direct Ollama, agentic classification
├── library.py       # Document library - chunking, embeddings, ChromaDB, RAG
├── memory.py        # Persistent memory - facts, session summaries, prompt injection
├── identity.py      # Persistent self-model - name, origin, development stage
├── timing.py        # Wu Wei Timing Gate - PROCEED/GENTLE/DEEP calibration
├── vitals.py        # 8 C's vital signs - self-awareness from pipeline signals
└── cli.py           # All CLI commands and interactive chat

Node Tiers

Tier        Example Hardware            Role
EXECUTIVE   RTX 3090 workstation        GPU inference, orchestration
SPECIALIST  RTX 3060 desktop            GPU inference (smaller models)
LIGHT       Lenovo M710Q, Raspberry Pi  Web search, file processing, coordination
WORKER      Any CPU-only machine        Distributed task execution

Discovery

Nodes broadcast via mDNS (_mycoswarm._tcp.local.). No central server, no configuration. Plug in a machine, start the daemon, the swarm grows.

Task Flow

User asks question on Node A
  → Node A checks: can I handle this locally?
    → Yes: execute locally
    → No: orchestrator scores all peers
      → Dispatch to best peer
      → Stream response back to Node A
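The flow boils down to a local capability check followed by peer selection. A minimal sketch, assuming a capability-set model of each node (the function and capability names are illustrative, not the daemon's real interface):

```python
def route(task_type: str, local_caps: set[str], peers: dict[str, set[str]]) -> str:
    """Return where a task should run: 'local' if this node advertises
    the capability, otherwise the first peer that does."""
    if task_type in local_caps:
        return "local"
    for peer, caps in peers.items():
        if task_type in caps:
            return peer
    raise RuntimeError(f"no node can handle {task_type!r}")
```

A CPU-only node asking for inference would thus forward the task to a GPU peer, while handling web search itself.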

Plugins

Extend the swarm without touching core code. Drop a directory into ~/.config/mycoswarm/plugins/:

~/.config/mycoswarm/plugins/
└── my_summarizer/
    ├── plugin.yaml
    └── handler.py

plugin.yaml:

name: my_summarizer
task_type: summarize
description: Summarize text by extracting key points
capabilities: cpu_worker

handler.py:

async def handle(task):
    text = task.payload.get("text", "")
    # Your logic here - e.g. keep the first sentence as a naive summary
    summary = text.split(".")[0]
    return {"summary": summary}

Restart the daemon. The node advertises the new capability. Other nodes can route summarize tasks to it.
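A loader like plugins.py presumably discovers these manifests by scanning the directory. A dependency-free sketch that pulls task_type out of each plugin.yaml with plain string parsing (an assumption - the real loader likely uses a proper YAML parser):

```python
from pathlib import Path

def scan_plugins(plugins_dir: Path) -> list[str]:
    """Return the task types advertised by installed plugins.

    Looks for <plugin>/plugin.yaml one level below `plugins_dir` and
    reads its `task_type:` line. Illustrative only."""
    task_types = []
    for manifest in sorted(plugins_dir.glob("*/plugin.yaml")):
        for line in manifest.read_text().splitlines():
            if line.startswith("task_type:"):
                task_types.append(line.split(":", 1)[1].strip())
    return task_types
```

Running this against the layout above would yield ["summarize"], which the daemon could then advertise to peers.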


Document Library

Drop files into ~/mycoswarm-docs/ and index them:

mycoswarm library ingest

Supports: PDF, Markdown, TXT, HTML, CSV, JSON.

Files are chunked, embedded (via Ollama), and stored in ChromaDB. Ask questions:

mycoswarm rag "what does the architecture section describe?"

Or use /rag in chat for inline document search.
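Chunking ahead of embedding is the standard RAG step here. A minimal sketch of fixed-size chunking with overlap - the sizes and strategy are assumptions for illustration; mycoSwarm's actual chunker lives in library.py:

```python
def chunk(text: str, size: int = 500, overlap: int = 50) -> list[str]:
    """Split text into overlapping windows so a sentence that straddles
    a boundary still appears whole in at least one chunk."""
    step = size - overlap
    return [text[i:i + size] for i in range(0, max(len(text) - overlap, 1), step)]
```

Each chunk would then be embedded via Ollama and stored in ChromaDB, so a question retrieves the few chunks most similar to it rather than whole documents.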


The Manifesto

Named after mycelium - the underground network connecting a forest. It doesn't centralize. It finds what's available and connects it.

If a student in Lagos with two old laptops can't participate, the framework has failed.

No cloud dependencies. No API keys. No expensive hardware requirements. Every node counts.


What's Next

  • Identity development - Monica grows through interaction, not just configuration
  • Swarm identity sync - Consistent self-model across all nodes
  • Agentic timing gate - SUPPRESS/DEFER/PROCEED for proactive actions
  • Agentic planner - LLM generates multi-step plans and executes them across the swarm
  • mTLS security - Encrypted, authenticated inter-node communication
  • Config files - ~/.config/mycoswarm/config.toml for persistent settings
  • Mesh networking - Connect swarms across the internet via VPN

Contributing

mycoSwarm is MIT licensed. Contributions welcome.

git clone https://github.com/msb-msb/mycoSwarm.git
cd mycoSwarm
python3 -m venv .venv
source .venv/bin/activate
pip install -e ".[dev]"
python -m pytest tests/ -v  # 398 tests, all offline

v0.2.9 | 398 tests | 5 nodes - Built with experience, not hype. InsiderLLM
