Pool Apple Silicon Macs for distributed compute and ML training

MacFleet

Distributed ML training on Apple Silicon. Pool your Macs into a cluster in 5 seconds, run PyTorch or MLX across them, and spend nothing on cloud.

  macfleet join              macfleet join            macfleet join
 ┌──────────────┐          ┌──────────────┐          ┌──────────────┐
 │  MacBook Pro │◄────────►│  MacBook Air │◄────────►│  Mac Studio  │
 │  M4 Pro      │  WiFi /  │  M4          │  WiFi /  │  M4 Ultra    │
 │  16 GPU cores│  ETH /   │  10 GPU cores│  ETH /   │  60 GPU cores│
 │  48 GB RAM   │  TB4     │  16 GB RAM   │  TB4     │  192 GB RAM  │
 └──────────────┘          └──────────────┘          └──────────────┘
         ▲                          ▲                          ▲
         └──────────────────────────┴──────────────────────────┘
                        Ring AllReduce (gradient sync)

Why MacFleet

Apple Silicon is everywhere. Every researcher, student, and founder has a serious ML machine on their desk. What's missing is a way to team them up.

  • PyTorch on MPS has no distributed story — NCCL is CUDA-only and Gloo doesn't work on MPS, so you're stuck on a single GPU.
  • MLX is native but most researchers' code is still PyTorch.
  • Cloud is expensive and the iteration loop is slow.

MacFleet fills that gap. Any two Macs on the same WiFi can pool their GPUs. Security is baked in (HMAC + TLS). Adaptive compression keeps WiFi viable for gradient sync. The framework-agnostic core lets you pick your engine (torch or mlx) per call.

Install

pip install macfleet                    # core
pip install "macfleet[torch]"           # + PyTorch
pip install "macfleet[mlx]"             # + Apple MLX
pip install "macfleet[all]"             # everything

The 5-minute path

1. Scaffold a starter script:

macfleet quickstart
# Wrote my_macfleet_demo.py

2. Run it:

python my_macfleet_demo.py
# Pool world size: 1
# Training done: {'loss': 0.31, 'epochs': 10, 'time_sec': 1.4}

3. Pair a second Mac:

On Mac #1:

macfleet join --bootstrap
# prints a QR code + pairing URL, also copies URL to pasteboard

On Mac #2 (same Apple ID → Handoff pasteboard sync):

macfleet pair && macfleet join

Or: scan the QR from Mac #1 with your iPhone camera. Tap the link. Done.

4. Set enable_pool_distributed=True in your script — training now spans both Macs.

Features

  • Dual engine — PyTorch (MPS) and Apple MLX, same pool infrastructure
  • Zero config — mDNS discovery, no coordinator setup, no config files
  • Safe task dispatch — @macfleet.task registry + msgpack args (no cloudpickle on the wire)
  • Adaptive compression — auto-selects TopK + FP16 based on link speed (1x–200x reduction)
  • Heterogeneous scheduling — faster Macs get bigger batches, adjusts for thermal throttling
  • Secure by default — auto-generated fleet tokens, HMAC mutual auth, mandatory TLS, per-IP rate limiting
  • Framework-agnostic core — communication layer uses only numpy, never imports torch or mlx
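
MacFleet's actual scheduling policy isn't spelled out here, but the core idea of heterogeneous scheduling — splitting a global batch in proportion to each node's compute — can be sketched as follows. `split_batch` is a hypothetical helper, not part of the MacFleet API, and the real scheduler also factors in thermal throttling per the feature list:

```python
# Hypothetical sketch: split a global batch across nodes in proportion
# to GPU core counts. MacFleet's real policy also adapts to thermals.
def split_batch(global_batch: int, gpu_cores: list[int]) -> list[int]:
    total = sum(gpu_cores)
    shares = [global_batch * c // total for c in gpu_cores]  # floor shares
    # Hand any leftover samples to the nodes with the most GPU cores.
    for i in sorted(range(len(gpu_cores)), key=lambda j: -gpu_cores[j])[: global_batch - sum(shares)]:
        shares[i] += 1
    return shares

# The three Macs from the diagram above: 16, 10, and 60 GPU cores.
print(split_batch(256, [16, 10, 60]))   # → [48, 29, 179]
```

The floor-then-distribute step guarantees the shares always sum to the global batch, so no sample is dropped or duplicated.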

Security

Security is on by default. The first macfleet join auto-generates a fleet token at ~/.macfleet/token (mode 0600). See the security reference for the full threat model.

Short version:

  • Fleet isolation — nodes with different tokens can't see each other on the network (mDNS service type is scoped by fleet hash)
  • Mutual authentication — HMAC-SHA256 challenge-response on every connection, plus signed hardware profile exchange (v2.2)
  • Encryption — TLS mandatory whenever auth is enabled
  • Rate limiting — 5 failed auth attempts per IP → 5-minute ban, exponential backoff in between (heartbeat read timeout tightened to 1s to stop slowloris)
  • No cloudpickle over the wire — @macfleet.task routes registered callables by name, not by pickled closures
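
The mutual-authentication step above is a standard HMAC-SHA256 challenge-response. A minimal sketch with Python's stdlib, assuming only a shared fleet token (MacFleet's wire format and message framing are not shown here):

```python
import hmac
import hashlib
import secrets

TOKEN = b"example-fleet-token"   # stand-in for ~/.macfleet/token

def make_challenge() -> bytes:
    return secrets.token_bytes(32)            # random nonce from the verifier

def respond(token: bytes, challenge: bytes) -> bytes:
    return hmac.new(token, challenge, hashlib.sha256).digest()

def verify(token: bytes, challenge: bytes, response: bytes) -> bool:
    expected = hmac.new(token, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)   # constant-time compare

challenge = make_challenge()
assert verify(TOKEN, challenge, respond(TOKEN, challenge))                # same fleet: pass
assert not verify(b"other-fleet", challenge, respond(TOKEN, challenge))  # different token: fail
```

Because each side proves knowledge of the token without ever sending it, a listener on the network learns nothing useful, and a node with a different fleet token can never pass verification.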

CLI

macfleet join         Join the pool (auto-discovers peers)
macfleet pair         Read a pairing URL from pasteboard / stdin
macfleet status       Show pool members and network info
macfleet info         Show local hardware profile
macfleet train        Run training (demo or custom script)
macfleet bench        Benchmark compute, network, or allreduce
macfleet doctor       System health check
macfleet quickstart   Write a starter training script

How it works

MacFleet uses data parallelism: every Mac holds a full copy of the model, trains on a weighted portion of the data, and averages gradients via Ring AllReduce after each step.
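
The gradient-averaging step can be simulated in-process with plain numpy. This is a sketch of the textbook Ring AllReduce (reduce-scatter then allgather), not MacFleet's implementation — transport, chunk sizing, and compute/communication overlap are omitted:

```python
import numpy as np

def ring_allreduce(grads: list) -> list:
    """Simulate Ring AllReduce over n 'nodes'; every node ends with the mean."""
    n = len(grads)
    # Each node splits its gradient vector into n chunks.
    chunks = [np.array_split(g.astype(np.float64), n) for g in grads]

    # Reduce-scatter: at each step, node i passes one partial chunk to node
    # i+1. After n-1 steps, node i owns the full sum of chunk (i+1) % n.
    for step in range(n - 1):
        sends = [chunks[i][(i - step) % n].copy() for i in range(n)]
        for i in range(n):
            chunks[(i + 1) % n][(i - step) % n] += sends[i]

    # Allgather: circulate the completed chunks until every node has all of them.
    for step in range(n - 1):
        sends = [chunks[i][(i + 1 - step) % n].copy() for i in range(n)]
        for i in range(n):
            chunks[(i + 1) % n][(i + 1 - step) % n] = sends[i]

    return [np.concatenate(c) / n for c in chunks]

grads = [np.full(8, float(i)) for i in range(3)]   # nodes hold 0s, 1s, 2s
print(ring_allreduce(grads)[0])                    # every node ends with 1.0s
```

The appeal of the ring topology is that each node sends roughly 2x its gradient size per step regardless of cluster size, so bandwidth per link stays constant as you add Macs.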

  Network         Compression        100 MB of gradients becomes
  Thunderbolt 4   None               100 MB
  Ethernet        TopK 10% + FP16    ~5 MB
  WiFi            TopK 1% + FP16     ~500 KB
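
The TopK + FP16 idea in the table can be sketched with numpy: keep only the largest-magnitude fraction of the gradient, cast it to float16, and ship (indices, values) instead of the dense tensor. This is illustrative only — MacFleet's actual codec may differ, and the raw uint32 indices here make the payload larger than the table's WiFi figure (tighter index encoding would close the gap):

```python
import numpy as np

def compress(grad: np.ndarray, keep: float):
    """Keep the top `keep` fraction of values by magnitude, cast to FP16."""
    k = max(1, int(grad.size * keep))
    idx = np.argpartition(np.abs(grad), -k)[-k:].astype(np.uint32)
    return idx, grad[idx].astype(np.float16)

def decompress(idx: np.ndarray, vals: np.ndarray, size: int) -> np.ndarray:
    out = np.zeros(size, dtype=np.float32)
    out[idx] = vals.astype(np.float32)
    return out

grad = np.random.default_rng(0).standard_normal(1_000_000).astype(np.float32)  # 4 MB dense
idx, vals = compress(grad, keep=0.01)               # TopK 1% + FP16
ratio = grad.nbytes / (idx.nbytes + vals.nbytes)
print(f"compressed {ratio:.0f}x")                   # ~67x with raw uint32 indices
```

Production TopK schemes usually also accumulate the dropped residual locally and add it back into the next step's gradient (error feedback), so that small values are eventually transmitted rather than lost.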

Requirements

  • macOS 14+ with Apple Silicon (M1/M2/M3/M4)
  • Python 3.11+
  • PyTorch 2.1+ or MLX 0.5+ (optional, pick your engine)

Documentation

Full docs: run mkdocs serve after pip install "macfleet[docs]", or read the Markdown source in docs/.

Development

git clone https://github.com/vikranthreddimasu/MacFleet.git
cd MacFleet
pip install -e ".[dev,all]"
make test       # 425+ tests
make lint       # ruff + mypy

License

MIT
