Orchestrate distributed swarms of AI agents that collaboratively solve complex tasks.

This project has been archived by its maintainers; no new releases are expected.

Project description

Hivemind

Distributed AI Swarm Runtime

PyPI · License: GPL v3 · Python 3.12+

Orchestrate multi-agent systems with a swarm execution model: tasks → DAG → parallel execution.

Install: PyPI package hivemind-ai · CLI hivemind


Quick start

1. Install (Python 3.12+):

pip install hivemind-ai
# or: uv add hivemind-ai

2. Set up API keys (pick one):

Store credentials in your OS keychain so you never re-enter them:

hivemind credentials set openai api_key      # prompts for value
hivemind credentials set anthropic api_key
hivemind credentials set github token
# or migrate from .env:
hivemind credentials migrate

Or use environment variables: OPENAI_API_KEY, ANTHROPIC_API_KEY, GITHUB_TOKEN, etc. (see Credentials).

3. Create a project and run:

hivemind init
hivemind run "Summarize swarm intelligence in one paragraph."

4. Optional — shell completion:

# Bash: add to ~/.bashrc
eval "$(hivemind completion bash)"

# Zsh: add to ~/.zshrc
eval "$(hivemind completion zsh)"

Run from code

From config file:

from hivemind import Swarm

swarm = Swarm(config="hivemind.toml")
results = swarm.run("Analyze diffusion models and write a one-page summary.")

Explicit parameters:

from hivemind import Swarm

swarm = Swarm(worker_count=4, worker_model="gpt-4o-mini", planner_model="gpt-4o-mini", use_tools=True)
results = swarm.run("Your task here.")

Credentials are injected from the keyring (or env) when config is resolved—no code changes needed.
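
For example, swapping providers is just a different model string; keys never appear in code. A minimal sketch (the provider:model strings below are illustrative assumptions, not pinned model releases):

from hivemind import Swarm

# Keys are pulled from the OS keyring (or env) during config resolution;
# nothing secret is passed here. The model strings are illustrative.
swarm = Swarm(worker_count=2, worker_model="anthropic:claude-sonnet", planner_model="openai:gpt-4o-mini")
results = swarm.run("Contrast two clustering approaches in a short note.")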


Credentials

API keys are not stored in config files. Use the credential store (OS keychain) or environment variables.

  • Store a key securely — hivemind credentials set <provider> <key> (prompts; uses keyring)
  • List stored keys (no values) — hivemind credentials list
  • Import from .env / TOML — hivemind credentials migrate
  • Export for sourcing / .env — hivemind credentials export <provider> (prints KEY=value lines)
  • Remove a key — hivemind credentials delete <provider> <key>

Providers: openai, anthropic, github, gemini, azure, azure_anthropic (keys: api_key, token, endpoint, deployment, api_version as applicable).

Example — export and source in a script:

eval "$(hivemind credentials export azure)"
hivemind run "Your task"

See Configuration and CLI for details.


CLI

  • hivemind init — Set up a new project (hivemind.toml)
  • hivemind doctor — Check environment (keys, config, tools)
  • hivemind run "task" — Run the swarm on a task
  • hivemind tui — Terminal UI (prompt, dashboard, logs)
  • hivemind credentials set/list/migrate/export/delete — Manage API keys (keyring)
  • hivemind completion bash | zsh — Print shell completion script
  • hivemind research [path] — Literature review on a directory
  • hivemind analyze [path] — Analyze repository architecture
  • hivemind memory [--limit N] — List memory entries
  • hivemind query "…" — Query the knowledge graph
  • hivemind workflow <name> — Run a workflow from workflow.hivemind.toml
  • hivemind graph [run_id] — Export the task DAG as Mermaid
  • hivemind replay [run_id] — Replay a run from the event log
  • hivemind cache stats | clear — Manage the task result cache
  • hivemind analytics — Tool usage stats
  • hivemind build "app description" [-o dir] — Autonomous app builder
  • hivemind upgrade [--check | -y] — Check for updates / upgrade

Run hivemind --help or hivemind <command> --help for examples and options.


Features

  • Planner → Scheduler → Executor → Agents — DAG-based execution with configurable parallelism
  • Strategy-based planning — Auto-selected strategies (research, code, data science, document, experiment) or LLM fallback
  • 120+ tools — Research, coding, data science, documents, experiments, memory; smart tool selection (top-k by similarity)
  • TOML config — hivemind.toml / workflow.hivemind.toml; env > project > user > defaults
  • Memory & knowledge graph — Episodic, semantic, research, artifact memory; summarization, namespaces, entity/relationship search
  • Map-reduce runtime — swarm.map_reduce(dataset, map_fn, reduce_fn) using the worker pool (see the sketch after this list)
  • Workflows — Define steps in workflow.hivemind.toml; run with hivemind workflow <name>; structured output self-correction (v1.7) retries with a correction prompt when JSON parsing fails
  • Critic & agent messaging (v1.7) — Optional second-pass critic scores results and requests one retry; per-run message bus lets agents share discoveries via BROADCAST:
  • Speculative pre-fetching (v1.7) — Pre-warm memory and tools for successor tasks while others run; reduces task spin-up time
  • Plugin ecosystem — Discover tools via entry_points (hivemind.plugins)
  • Provider routing — OpenAI, Anthropic, Azure, Gemini, GitHub Models (Copilot) (provider:model or model name)
  • Automatic model routing — planner = "auto" and worker = "auto" for cost/latency/quality-aware selection
  • EventLog, replay, telemetry — Structured events for debugging and metrics
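
The map-reduce runtime from the list above can be sketched as follows. Only the swarm.map_reduce(dataset, map_fn, reduce_fn) signature comes from this README; how the two callables are applied (map once per item on the worker pool, reduce once over the collected results) is an assumption here:

from hivemind import Swarm

swarm = Swarm(worker_count=4)

chapters = ["ch1.md", "ch2.md", "ch3.md"]  # illustrative dataset

def summarize(item):
    # Assumed contract: called once per dataset item on a pooled worker.
    return f"Summarize {item} in two sentences."

def combine(partials):
    # Assumed contract: called once with all per-item results.
    return "\n".join(str(p) for p in partials)

result = swarm.map_reduce(chapters, summarize, combine)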

Architecture

    Planner
       ↓
    Scheduler
       ↓
    Executor
       ↓
    Agents  →  Tools  →  Memory  →  Knowledge Graph

Configuration

Priority: env > project config > user ~/.config/hivemind/config.toml > defaults.

Locations: ./hivemind.toml, ./workflow.hivemind.toml, ~/.config/hivemind/config.toml, or legacy .hivemind/config.toml.

Keep secrets out of TOML. Use hivemind credentials or environment variables for API keys. Non-secret settings (models, workers, paths) go in TOML.

Example hivemind.toml:

[swarm]
workers = 6
adaptive_planning = true
max_iterations = 10
critic_enabled = true
critic_roles = ["research", "analysis", "code"]
message_bus_enabled = true
prefetch_enabled = true

[models]
planner = "auto"
worker = "auto"

[memory]
enabled = true
store_results = true
top_k = 5

[tools]
enabled = ["research", "coding", "documents"]
top_k = 12

[telemetry]
enabled = true
save_events = true

[providers.azure]
endpoint = ""   # or use credentials store / env
deployment = ""

Env overrides: HIVEMIND_WORKER_MODEL, HIVEMIND_PLANNER_MODEL, HIVEMIND_EVENTS_DIR, HIVEMIND_DATA_DIR, plus provider keys. Full schema: docs/configuration.md, docs/providers.md.
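
As a sketch, the same overrides can be set from Python before the swarm is built. This assumes the variables are read during config resolution, as the credentials note above suggests; the model values are illustrative:

import os

# Env sits at the top of the priority chain: env > project > user > defaults.
os.environ["HIVEMIND_WORKER_MODEL"] = "gpt-4o-mini"   # illustrative value
os.environ["HIVEMIND_PLANNER_MODEL"] = "gpt-4o-mini"

from hivemind import Swarm

swarm = Swarm(config="hivemind.toml")  # overrides apply when config is resolved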


Examples

  • Literature review — hivemind research papers/ or uv run python examples/research/literature_review.py [dir]
  • Repository analysis — hivemind analyze . or uv run python examples/coding/analyze_repository.py [path]
  • Dataset analysis — uv run python examples/data_science/dataset_analysis.py [path-to.csv]
  • Document intelligence — uv run python examples/documents/analyze_documents.py [dir]
  • Parameter sweep — uv run python examples/experiments/parameter_sweep.py --params '{"lr":[0.01,0.1]}'

Outputs under examples/output/. Run from project root when using script paths.


Documentation

Full docs (with versioning and dark mode): hivemind.rithul.dev. Source lives in website/docs/ and is built with Docusaurus.

  • Introduction — What Hivemind is, the problem it solves, core concepts
  • Architecture — Planner, Scheduler, Executor, Agents, Tools, Memory, strategies
  • Configuration — TOML schema, locations, env, credentials
  • Swarm runtime — Task lifecycle, flow, map-reduce
  • Tools — Registry, runner, smart selection, plugins
  • Memory — Types, store, retrieval, knowledge graph
  • Providers — Provider routing, Azure, GitHub Models, auto routing
  • CLI — All commands, credentials, completion
  • TUI — Layout, panels, shortcuts
  • Examples — Workflows and commands
  • Development — Structure, adding tools/plugins/workflows
  • Contributing — Setup, testing, PR guidelines
  • FAQ — Common questions

Contributing

Contributions welcome. See CONTRIBUTING.md.


License

GPL-3.0-or-later — see LICENSE.

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

hivemind_ai-1.9.0.tar.gz (2.9 MB)


Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

hivemind_ai-1.9.0-py3-none-any.whl (369.6 kB)


File details

Details for the file hivemind_ai-1.9.0.tar.gz.

File metadata

  • Download URL: hivemind_ai-1.9.0.tar.gz
  • Upload date:
  • Size: 2.9 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for hivemind_ai-1.9.0.tar.gz
  • SHA256: a6b2eca8613d02f9ce96fec25d5a0ad70331f56d1bdb1364c3d845e71ed4d2ac
  • MD5: 2055a496d6cee547e098c61a22dea3a1
  • BLAKE2b-256: a645011d19bf79fd8e51e38dde693b70f28565c5b8ad0c88c18e33045c9ae18d

See more details on using hashes here.

Provenance

The following attestation bundles were made for hivemind_ai-1.9.0.tar.gz:

Publisher: release-on-tag.yml on rithulkamesh/hivemind

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file hivemind_ai-1.9.0-py3-none-any.whl.

File metadata

  • Download URL: hivemind_ai-1.9.0-py3-none-any.whl
  • Upload date:
  • Size: 369.6 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for hivemind_ai-1.9.0-py3-none-any.whl
  • SHA256: 5d3f2b4e4d38653bca2916b64baca0cd15826579286ff362ee728c64f33ccd3d
  • MD5: 100a08f71f34d84448be442a8409f1d0
  • BLAKE2b-256: d933fd1e26c0cea3a56d1a34dc85f267691530acd12f3908f1389f0309f78ef1

See more details on using hashes here.

Provenance

The following attestation bundles were made for hivemind_ai-1.9.0-py3-none-any.whl:

Publisher: release-on-tag.yml on rithulkamesh/hivemind

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
