
Orchestrate distributed swarms of AI agents that collaboratively solve complex tasks.

This project has been archived by its maintainers. No new releases are expected.

Project description

Hivemind

Distributed AI Swarm Runtime

PyPI · License: GPL v3 · Python 3.12+

Orchestrate multi-agent systems with a swarm execution model: tasks → DAG → parallel execution.

Install: PyPI package hivemind-ai · CLI command hivemind


Features

  • Planner → Scheduler → Executor → Agents — DAG-based execution with configurable parallelism
  • Strategy-based planning — Auto-selected strategies (research, code, data science, document, experiment) or LLM fallback
  • 120+ tools — Research, coding, data science, documents, experiments, memory; smart tool selection (top-k by similarity)
  • TOML config — hivemind.toml / workflow.hivemind.toml with Pydantic validation; env > project > user > defaults
  • Memory & knowledge graph — Episodic, semantic, research, artifact memory; summarization, namespaces, entity/relationship search
  • Map-reduce runtime — swarm.map_reduce(dataset, map_fn, reduce_fn) using the worker pool
  • Workflows — Define steps in workflow.hivemind.toml; run with hivemind workflow <name>
  • Plugin ecosystem — Discover tools via entry_points (hivemind.plugins)
  • Provider routing — OpenAI, Anthropic, Azure, Gemini, GitHub Models (Copilot) (provider:model or model name)
  • Automatic model routing — planner = "auto" and worker = "auto" for cost/latency/quality-aware selection
  • EventLog, replay, telemetry — Structured events for debugging and metrics
  • CLI & TUI — hivemind init, hivemind doctor, hivemind run, hivemind research, hivemind analyze, hivemind memory, hivemind query, hivemind workflow, hivemind tui (dashboard: tasks, swarm graph, memory, activity feed, knowledge graph, logs)
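The map-reduce runtime listed above exposes swarm.map_reduce(dataset, map_fn, reduce_fn). Its semantics can be illustrated with a minimal local sketch — this is not Hivemind's implementation; only the call shape mirrors the documented signature:

```python
from concurrent.futures import ThreadPoolExecutor
from functools import reduce

def map_reduce(dataset, map_fn, reduce_fn, workers=4):
    """Apply map_fn to each item in parallel, then fold the results with reduce_fn."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        mapped = list(pool.map(map_fn, dataset))
    return reduce(reduce_fn, mapped)

# Example: total word count across a small document set
docs = ["swarm intelligence", "distributed swarm runtime"]
total_words = map_reduce(docs, lambda d: len(d.split()), lambda a, b: a + b)
# total_words == 5
```

In Hivemind the map and reduce steps would be dispatched to the worker pool of agents rather than local threads.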

Architecture

    Planner
       ↓
    Scheduler
       ↓
    Executor
       ↓
    Agents  →  Tools  →  Memory  →  Knowledge Graph
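The pipeline above comes down to tasks → DAG → parallel execution: the scheduler releases tasks whose dependencies are done, and independent tasks run concurrently. A toy scheduler illustrating that idea (illustrative only; Hivemind's actual executor is more involved):

```python
from concurrent.futures import ThreadPoolExecutor
from graphlib import TopologicalSorter

def run_dag(tasks, deps, workers=4):
    """Run callables in `tasks` respecting `deps` (node -> set of predecessors);
    tasks with no unmet dependencies execute in parallel."""
    order = TopologicalSorter(deps)
    order.prepare()
    results = {}
    with ThreadPoolExecutor(max_workers=workers) as pool:
        while order.is_active():
            ready = order.get_ready()                      # all currently runnable tasks
            futures = {name: pool.submit(tasks[name]) for name in ready}
            for name, fut in futures.items():
                results[name] = fut.result()
                order.done(name)                           # unblock dependents
    return results

tasks = {"plan": lambda: "plan", "a": lambda: 1, "b": lambda: 2, "join": lambda: 3}
deps = {"a": {"plan"}, "b": {"plan"}, "join": {"a", "b"}}
results = run_dag(tasks, deps)
# "a" and "b" run in parallel after "plan"; "join" runs last
```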

Quickstart

Install (Python 3.12+):

pip install hivemind-ai
# or: uv add hivemind-ai

New project:

export GITHUB_TOKEN=...   # or OPENAI_API_KEY, ANTHROPIC_API_KEY, etc.
hivemind init
hivemind run "analyze this repository"

Run a task:

hivemind run "Summarize swarm intelligence in one paragraph."

In code (config file):

from hivemind import Swarm

swarm = Swarm(config="hivemind.toml")
results = swarm.run("Analyze diffusion models and write a one-page summary.")

Or explicit parameters:

from hivemind import Swarm

swarm = Swarm(worker_count=4, worker_model="gpt-4o-mini", planner_model="gpt-4o-mini", use_tools=True)
results = swarm.run("Analyze diffusion models and write a one-page summary.")

Set API keys via environment or config (see Configuration).


CLI

  • hivemind init — Set up a new project (hivemind.toml, example workflow, dataset folder)
  • hivemind doctor — Verify environment (GITHUB_TOKEN, OpenAI keys, config, tool registry)
  • hivemind run "task" — Run the swarm on the given task
  • hivemind tui — Terminal UI (prompt, output, dashboard)
  • hivemind research papers/ — Literature review over a directory of papers
  • hivemind analyze repo/ — Analyze repository architecture
  • hivemind memory [--limit N] — List memory entries
  • hivemind query "…" — Query the knowledge graph (entity search, relationships)
  • hivemind workflow <name> — Run a workflow from workflow.hivemind.toml

TUI: Press Enter at the prompt (or r) to run; d opens the dashboard (tasks, swarm graph, memory, activity feed, knowledge graph, logs); Esc unfocuses, o shows output, q quits.


Examples

  • Literature review — hivemind research papers/ or uv run python examples/research/literature_review.py [dir]
  • Repository analysis — hivemind analyze . or uv run python examples/coding/analyze_repository.py [path]
  • Dataset analysis — uv run python examples/data_science/dataset_analysis.py [path-to.csv]
  • Document intelligence — uv run python examples/documents/analyze_documents.py [dir]
  • Parameter sweep — uv run python examples/experiments/parameter_sweep.py --params '{"lr":[0.01,0.1]}'

Outputs under examples/output/. Run from project root when using script paths.


Configuration

Priority: env > project config > user ~/.config/hivemind/config.toml > defaults.

Locations: ./hivemind.toml, ./workflow.hivemind.toml, ~/.config/hivemind/config.toml, or legacy .hivemind/config.toml.
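The precedence chain (env > project config > user config > defaults) can be sketched as a simple lookup. Variable names follow the HIVEMIND_* convention documented in this section; the function itself is illustrative, not the library's loader:

```python
import os

def resolve(key, project_cfg, user_cfg, defaults, env_prefix="HIVEMIND_"):
    """Resolve a config key: environment wins, then project, then user, then defaults."""
    env_val = os.environ.get(env_prefix + key.upper())
    if env_val is not None:
        return env_val
    for source in (project_cfg, user_cfg, defaults):
        if key in source:
            return source[key]
    raise KeyError(key)

defaults = {"worker_model": "gpt-4o-mini"}
project = {"worker_model": "gpt-4o"}
resolve("worker_model", project, {}, defaults)  # "gpt-4o" — project beats defaults
```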

GitHub Models (Copilot): Use provider:model and set GITHUB_TOKEN. Example: github:gpt-4o, github:claude-3.5-sonnet, github:phi-3.
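A provider:model spec like github:gpt-4o splits on the first colon, with bare model names routed to some default provider. A hypothetical parser sketch (the fallback provider here is an assumption, not documented behavior):

```python
def parse_model_spec(spec, default_provider="openai"):
    """Split 'provider:model' into (provider, model); bare names use the default provider."""
    provider, sep, model = spec.partition(":")
    if not sep:
        return default_provider, spec
    return provider, model

parse_model_spec("github:gpt-4o")  # ("github", "gpt-4o")
parse_model_spec("gpt-4o-mini")    # ("openai", "gpt-4o-mini")
```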

Automatic model routing: Set planner = "auto" and worker = "auto" in [models]; the router picks by task type (planning → quality, fast → cost).

Example hivemind.toml:

[swarm]
workers = 6
adaptive_planning = true
max_iterations = 10

[models]
planner = "auto"
worker = "auto"

[memory]
enabled = true
store_results = true
top_k = 5

[tools]
enabled = ["research", "coding", "documents"]
top_k = 12

[telemetry]
enabled = true
save_events = true

[providers.azure]
endpoint = ""
deployment = ""

Legacy [default] with worker_model, planner_model, events_dir, data_dir is still supported. Env overrides: HIVEMIND_WORKER_MODEL, HIVEMIND_PLANNER_MODEL, HIVEMIND_EVENTS_DIR, HIVEMIND_DATA_DIR, plus provider keys. See docs/providers.md, docs/configuration.md, docs/development.md.
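For example, the documented environment overrides can be set per shell session before a run (model values here are illustrative):

```shell
# Override the configured models for this shell session (values illustrative)
export HIVEMIND_WORKER_MODEL="gpt-4o-mini"
export HIVEMIND_PLANNER_MODEL="auto"
echo "$HIVEMIND_WORKER_MODEL"
```

Because environment variables sit at the top of the precedence chain, these values win over any hivemind.toml setting.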


Documentation

  • Introduction — What Hivemind is, the problem it solves, core concepts
  • Architecture — Planner, Scheduler, Executor, Agents, Tools, Memory, strategies, config, map-reduce
  • Configuration — TOML schema, locations, env overrides
  • Swarm runtime — Task lifecycle, flow, map-reduce
  • Tools — Registry, runner, smart selection, plugins
  • Memory — Types, store, retrieval, summarization, namespaces, knowledge graph
  • Providers — Provider routing, model spec, Azure, GitHub Models, auto routing
  • CLI — Commands: run, tui, research, analyze, memory, query, workflow, init, doctor
  • TUI — Layout, panels, shortcuts
  • Examples — Workflows and commands
  • Development — Structure, adding tools/plugins/workflows
  • Contributing — Setup, testing, PR guidelines
  • FAQ — Common questions
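Plugin discovery (see the Tools and Development docs) uses the hivemind.plugins entry-point group. A hypothetical third-party tool package might register itself in its pyproject.toml like this (package and module names are made up):

```toml
[project.entry-points."hivemind.plugins"]
my_tools = "my_hivemind_tools.plugin"
```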

Contributing

Contributions welcome. See CONTRIBUTING.md.


License

GPL-3.0-or-later — see LICENSE.

Project details


Download files

Download the file for your platform.

Source Distribution

hivemind_ai-1.1.1.tar.gz (2.5 MB)


Built Distribution


hivemind_ai-1.1.1-py3-none-any.whl (264.3 kB)


File details

Details for the file hivemind_ai-1.1.1.tar.gz.

File metadata

  • Download URL: hivemind_ai-1.1.1.tar.gz
  • Size: 2.5 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for hivemind_ai-1.1.1.tar.gz
  • SHA256: dabc1b06e9cae032980be65cd196cf977655d22fa2867e34b59d887206c3407a
  • MD5: ecdcca2ab259eef367e601f53dbf5111
  • BLAKE2b-256: dfeabaa396b5a9f6468a92eec1195c5454e0184628098d951737db0e6bb62977


Provenance

The following attestation bundles were made for hivemind_ai-1.1.1.tar.gz:

Publisher: pypi-publish.yml on rithulkamesh/hivemind

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file hivemind_ai-1.1.1-py3-none-any.whl.

File metadata

  • Download URL: hivemind_ai-1.1.1-py3-none-any.whl
  • Size: 264.3 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for hivemind_ai-1.1.1-py3-none-any.whl
  • SHA256: 75c3c791d35eae2c6378a90c6335f9a86eaef7bbbeb5daadd4fd9650433c8c7f
  • MD5: e7750102e799245e79059a3a0b45b848
  • BLAKE2b-256: fcbb17980b15e52a2cbb939329b2612c886bbf921345417166d69c1f4da9b1d9


Provenance

The following attestation bundles were made for hivemind_ai-1.1.1-py3-none-any.whl:

Publisher: pypi-publish.yml on rithulkamesh/hivemind

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
