# MacFleet
Pool Apple Silicon Macs into a distributed ML training cluster.
Turn spare MacBooks, Mac Minis, and Mac Studios into one big GPU. MacFleet connects them over Thunderbolt, Ethernet, or WiFi and splits training across all of them automatically.
```
     macfleet join              macfleet join              macfleet join
┌──────────────┐          ┌──────────────┐          ┌──────────────┐
│  MacBook Pro │◄────────►│  MacBook Air │◄────────►│  Mac Studio  │
│    M4 Pro    │  WiFi /  │      M4      │  WiFi /  │   M4 Ultra   │
│ 16 GPU cores │  ETH /   │ 10 GPU cores │  ETH /   │ 60 GPU cores │
│  48 GB RAM   │  TB4     │  16 GB RAM   │  TB4     │  192 GB RAM  │
└──────────────┘          └──────────────┘          └──────────────┘
        ▲                         ▲                         ▲
        └─────────────────────────┴─────────────────────────┘
                  Ring AllReduce (gradient sync)
```
## Install

```
pip install macfleet            # core
pip install "macfleet[torch]"   # + PyTorch
pip install "macfleet[mlx]"     # + Apple MLX
pip install "macfleet[all]"     # everything
```

(The quotes around the extras keep zsh, the macOS default shell, from globbing the brackets.)
## Quick Start

**1. Join the pool** (run on each Mac):

```
macfleet join
```

No config files, no IP addresses. Macs find each other automatically via mDNS/Bonjour.

**2. Train:**

```python
import torch
import torch.nn as nn

import macfleet

model = nn.Sequential(nn.Linear(784, 256), nn.ReLU(), nn.Linear(256, 10))

# Any (inputs, labels) tensors work; random data stands in for a real dataset here.
X_train = torch.randn(10_000, 784)
y_train = torch.randint(0, 10, (10_000,))

with macfleet.Pool() as pool:
    result = pool.train(model=model, dataset=(X_train, y_train), epochs=10)
```
## Features
- Dual engine — PyTorch (MPS) and Apple MLX, same pool infrastructure
- Zero config — mDNS discovery, no coordinator setup, no config files
- Adaptive compression — auto-selects TopK + FP16 based on link speed (1x–200x reduction)
- Heterogeneous scheduling — faster Macs get bigger batches, adjusts for thermal throttling
- Secure by default — auto-generated fleet tokens, HMAC mutual auth, mandatory TLS, gradient validation
- Framework-agnostic core — communication layer uses only numpy, never imports torch or mlx
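The adaptive-compression idea can be sketched with plain numpy. The function names and the exact wire format below are illustrative assumptions, not MacFleet's actual protocol; the point is what "TopK + FP16" buys you:

```python
import numpy as np

def compress_topk_fp16(grad: np.ndarray, k_frac: float = 0.1):
    """Keep only the largest-magnitude fraction of gradient values, cast to FP16.

    Returns (indices, fp16 values, original shape): enough for the receiver
    to rebuild a sparse approximation of the gradient.
    """
    flat = grad.ravel()
    k = max(1, int(flat.size * k_frac))
    idx = np.argpartition(np.abs(flat), -k)[-k:]      # indices of the top-k values
    return idx.astype(np.uint32), flat[idx].astype(np.float16), grad.shape

def decompress(idx, vals, shape):
    """Rebuild a dense gradient with zeros everywhere but the transmitted entries."""
    flat = np.zeros(int(np.prod(shape)), dtype=np.float32)
    flat[idx] = vals.astype(np.float32)
    return flat.reshape(shape)

grad = np.random.randn(1000, 100).astype(np.float32)
idx, vals, shape = compress_topk_fp16(grad, k_frac=0.01)   # WiFi-style setting
ratio = grad.nbytes / (idx.nbytes + vals.nbytes)           # ~66x here; indices cost bytes too
```

Note that the index array eats into the headline ratio: 1% of values at 2 bytes each plus 4-byte indices gives roughly 66x, not a full 100x, before any further index encoding.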
## Security

Security is enabled by default. The first `macfleet join` auto-generates a fleet token and saves it to `~/.macfleet/fleet-token`:

```
macfleet join                   # auto-generates token, prints it
macfleet join --token <token>   # join with a specific token (copy from first node)
macfleet join --fleet-id lab    # isolate by fleet name
macfleet join --open            # disable security (not recommended)
```
**What's protected:**
- Fleet isolation — nodes with different tokens are invisible to each other on the network
- Mutual authentication — HMAC-SHA256 challenge-response on every connection
- Encryption — TLS enabled automatically (mandatory with auth)
- Authenticated heartbeat — HMAC-signed liveness probes, replay-resistant
- Gradient validation — rejects NaN, Inf, and extreme magnitudes (anti-poisoning)
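The challenge-response handshake maps directly onto Python's standard `hmac` module. A minimal sketch, assuming a shared fleet token (the token value and function names here are made up for illustration, not MacFleet's API):

```python
import hashlib
import hmac
import os

# Hypothetical token for illustration; real tokens live in ~/.macfleet/fleet-token.
FLEET_TOKEN = b"example-fleet-token"

def make_challenge() -> bytes:
    return os.urandom(32)                 # fresh nonce per connection resists replay

def respond(token: bytes, challenge: bytes) -> bytes:
    return hmac.new(token, challenge, hashlib.sha256).digest()

def verify(token: bytes, challenge: bytes, response: bytes) -> bool:
    expected = respond(token, challenge)
    return hmac.compare_digest(expected, response)   # constant-time comparison

# Node A challenges node B; B proves it holds the same fleet token
# without ever sending the token itself over the wire.
challenge = make_challenge()
assert verify(FLEET_TOKEN, challenge, respond(FLEET_TOKEN, challenge))
assert not verify(FLEET_TOKEN, challenge, respond(b"wrong-token", challenge))
```

Because each side challenges the other with a fresh nonce, a captured response is useless for replay, which is the same property the signed heartbeats rely on.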
## CLI

```
macfleet join       Join the pool (auto-discovers peers)
macfleet status     Show pool members and network info
macfleet info       Show local hardware profile
macfleet train      Run training (demo or custom script)
macfleet bench      Benchmark compute, network, or allreduce
macfleet diagnose   System health check
```
## How It Works
MacFleet uses data parallelism: every Mac holds a full copy of the model, trains on a weighted portion of the data, and averages gradients via Ring AllReduce after each step.
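Ring AllReduce can be simulated in one process, treating each "node" as an array (a real cluster would send each chunk to its ring neighbor over the network instead of indexing a list):

```python
import numpy as np

def ring_allreduce(node_grads):
    """Average per-node gradients via ring scatter-reduce + allgather.

    Each node splits its gradient into n chunks, so every node sends and
    receives only grad_size / n bytes per step, over 2 * (n - 1) steps.
    """
    n = len(node_grads)
    chunks = [np.array_split(g.astype(np.float64).ravel(), n) for g in node_grads]

    # Scatter-reduce: after n-1 steps, node i holds the full sum in chunk (i+1) % n.
    for t in range(n - 1):
        sends = [chunks[i][(i - t) % n].copy() for i in range(n)]  # snapshot before mutation
        for i in range(n):
            chunks[(i + 1) % n][(i - t) % n] += sends[i]

    # Allgather: circulate the completed chunks until every node has all of them.
    for t in range(n - 1):
        sends = [chunks[i][(i + 1 - t) % n].copy() for i in range(n)]
        for i in range(n):
            chunks[(i + 1) % n][(i + 1 - t) % n] = sends[i]

    shape = node_grads[0].shape
    return [np.concatenate(c).reshape(shape) / n for c in chunks]

grads = [np.full((8,), float(i)) for i in range(3)]  # 3 "nodes" with gradients 0, 1, 2
avg = ring_allreduce(grads)                          # every node ends with the mean, 1.0
```

The chunking is what makes the ring attractive on consumer links: per-step traffic shrinks with cluster size, and total bytes moved per node stay roughly constant as Macs are added.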
| Network | Compression | 100 MB of gradients becomes |
|---|---|---|
| Thunderbolt 4 | None | 100 MB |
| Ethernet | TopK 10% + FP16 | ~5 MB |
| WiFi | TopK 1% + FP16 | ~500 KB |
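The table suggests a simple policy: the slower the measured link, the more aggressive the compression. A sketch of such a selector, where the thresholds and the `pick_compression` name are assumptions rather than MacFleet's actual tuning:

```python
def pick_compression(bandwidth_gbps: float) -> dict:
    """Map measured link speed to a compression policy.

    Thresholds and ratios here are illustrative; the shape of the policy
    follows the Thunderbolt / Ethernet / WiFi tiers in the table.
    """
    if bandwidth_gbps >= 10:                      # Thunderbolt-class link
        return {"topk": None, "dtype": "fp32"}    # send raw gradients
    if bandwidth_gbps >= 1:                       # gigabit Ethernet
        return {"topk": 0.10, "dtype": "fp16"}    # ~20x smaller
    return {"topk": 0.01, "dtype": "fp16"}        # WiFi-class link, ~200x smaller

print(pick_compression(40))   # Thunderbolt 4
print(pick_compression(0.3))  # congested WiFi
```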
## Requirements
- macOS with Apple Silicon (M1/M2/M3/M4)
- Python 3.11+
- PyTorch 2.1+ or MLX 0.5+
## Development

```
git clone https://github.com/vikranthreddimasu/MacFleet.git
cd MacFleet
pip install -e ".[dev,all]"
make test   # 373 tests
make lint   # ruff + mypy
```
## License
MIT
## File details: macfleet-2.1.0.tar.gz (source distribution)

- Size: 75.7 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.12.7

| Algorithm | Hash digest |
|---|---|
| SHA256 | 5d131bad34bc7f59e3b174235caf288d5be24a9bf49ea915f68c91b75728e347 |
| MD5 | f7e1852edcec9fc44078d7fddfe8d424 |
| BLAKE2b-256 | 7b6c284d389794abd1ffb2be0c7ec3d36d0a5169094651bec55c8cf600dab331 |
## File details: macfleet-2.1.0-py3-none-any.whl (built distribution)

- Size: 91.7 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.12.7

| Algorithm | Hash digest |
|---|---|
| SHA256 | 425b7e0299ecd542474c951a5fe03592e17c0d6e1975f6c14e8cd18bbb922584 |
| MD5 | 61a6aaa1ef093ce66d92f1390bf66e96 |
| BLAKE2b-256 | 0536bb3a6b9a09ddab34bde951e18b9763618d4560c361fa1930ecff0ebfaa45 |