MacFleet
Pool Apple Silicon Macs into a distributed ML training cluster.
Zero-config discovery. N-node scaling. WiFi, Ethernet, or Thunderbolt.
  macfleet join             macfleet join             macfleet join
┌──────────────┐          ┌──────────────┐          ┌──────────────┐
│ MacBook Pro  │◄────────►│ MacBook Air  │◄────────►│ Mac Studio   │
│ M4 Pro       │  WiFi /  │ M4           │  WiFi /  │ M4 Ultra     │
│ 16 GPU cores │  ETH /   │ 10 GPU cores │  ETH /   │ 60 GPU cores │
│ 48 GB RAM    │  TB4     │ 16 GB RAM    │  TB4     │ 192 GB RAM   │
│ weight: 0.35 │          │ weight: 0.15 │          │ weight: 0.50 │
└──────────────┘          └──────────────┘          └──────────────┘
       ▲                         ▲                         ▲
       └─────────────────────────┴─────────────────────────┘
                   Ring AllReduce (gradient sync)
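The gradient sync in the diagram is a standard ring allreduce: a reduce-scatter pass followed by an allgather pass, so each node sends and receives only 2(n-1)/n of the gradient volume. As an illustration only (the function name ring_allreduce is ours, not MacFleet's API, and the real Collectives layer moves chunks over the network rather than in one process), the data movement can be sketched like this:

```python
def ring_allreduce(grads):
    """Sum equal-length gradient vectors held by n simulated ring nodes."""
    n, size = len(grads), len(grads[0])
    chunk = (size + n - 1) // n                      # chunk size per node
    span = lambda c: range(c * chunk, min((c + 1) * chunk, size))

    # Phase 1: reduce-scatter. At step s, node i sends chunk (i - s) mod n
    # to its ring successor, which accumulates it. After n-1 steps, node i
    # holds the fully summed chunk (i + 1) mod n.
    for step in range(n - 1):
        for i in range(n):
            c = (i - step) % n
            dst = (i + 1) % n
            for k in span(c):
                grads[dst][k] += grads[i][k]

    # Phase 2: allgather. Each node forwards its completed chunk around
    # the ring until every node holds every summed chunk.
    for step in range(n - 1):
        for i in range(n):
            c = (i + 1 - step) % n
            dst = (i + 1) % n
            for k in span(c):
                grads[dst][k] = grads[i][k]
    return grads
```

After the call, every "node" holds the elementwise sum, which is why allreduce bandwidth stays roughly constant as nodes are added.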
Features
- Zero-Config Pooling: pip install macfleet && macfleet join — auto-discovers peers via mDNS/Bonjour
- N-Node Scaling: Ring AllReduce for 2+ nodes (not limited to pairs)
- Any Network: WiFi, Ethernet, and Thunderbolt with adaptive buffer tuning
- Dual Engine: PyTorch+MPS and Apple MLX — pluggable via Engine protocol
- Heterogeneous Scheduling: Weighted batch allocation based on GPU cores + thermal state
- Gossip Heartbeat: Peer-to-peer failure detection, automatic coordinator election
- Adaptive Compression: Bandwidth-aware TopK+FP16 (auto-selects by link type: WiFi=200x, Ethernet=20x, TB4=off)
- Framework-Agnostic Core: Communication layer uses numpy — never imports torch/mlx
- Health Monitoring: Thermal, memory, battery, loss trend — composite health score per node
- Rich TUI Dashboard: Real-time cluster topology, training progress, and warnings
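To make the heterogeneous-scheduling idea concrete, here is a hedged sketch (split_batch is a hypothetical helper, not MacFleet's API; we assume a node's share of the global batch is proportional to its GPU-core count scaled by a thermal factor, with largest-remainder rounding so the shares sum exactly to the batch size):

```python
# Illustrative weighted batch allocation, not MacFleet's actual scheduler.
def split_batch(global_batch, nodes):
    """nodes: list of (name, gpu_cores, thermal_factor in (0, 1])."""
    scores = [cores * thermal for _, cores, thermal in nodes]
    total = sum(scores)
    raw = [global_batch * s / total for s in scores]
    shares = [int(r) for r in raw]                    # floor each share
    # Hand leftover samples to the nodes with the largest remainders.
    order = sorted(range(len(raw)), key=lambda i: raw[i] - shares[i], reverse=True)
    for i in order[: global_batch - sum(shares)]:
        shares[i] += 1
    return {nodes[i][0]: shares[i] for i in range(len(nodes))}
```

With the three machines from the diagram (and a made-up 0.8 thermal factor for a throttling MacBook Air), split_batch(128, [("pro", 16, 1.0), ("air", 10, 0.8), ("studio", 60, 1.0)]) hands the Mac Studio the bulk of each batch while the shares still sum to 128.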
Quick Start
Install

pip install macfleet
Join the pool
# On each Mac:
macfleet join
Train a model (Python SDK)
import macfleet

# PyTorch
with macfleet.Pool() as pool:
    pool.train(
        model=MyModel(),
        dataset=my_dataset,
        epochs=10,
        batch_size=128,
    )

# MLX (Apple native)
with macfleet.Pool(engine="mlx") as pool:
    pool.train(
        model=mlx_model,
        dataset=(X, y),
        epochs=10,
        loss_fn=my_loss_fn,
    )

# One-liner
macfleet.train(model=MyModel(), dataset=ds, epochs=10)

# Decorator
@macfleet.distributed(engine="torch")
def my_training():
    ...
CLI commands
macfleet info # Local hardware profile
macfleet status # Discover pool members on the network
macfleet diagnose # System health check (MPS, thermal, network)
macfleet train # Demo training on synthetic data
macfleet bench # Benchmark compute, network, and allreduce
Architecture
┌─────────────────────────────────────────────────────────────────┐
│ CLI: macfleet join | status | train | bench | info | diagnose │
│ SDK: macfleet.Pool() | macfleet.train() │
├─────────────────────────────────────────────────────────────────┤
│ Training: DataParallel | TrainingLoop | WeightedSampler │
├─────────────────────────────────────────────────────────────────┤
│ Engines: TorchEngine (PyTorch+MPS) | MLXEngine (Apple MLX) │
├─────────────────────────────────────────────────────────────────┤
│ Compression: TopK + FP16 + Adaptive pipeline │
├─────────────────────────────────────────────────────────────────┤
│ Pool: Agent | Registry | Discovery | Scheduler | Heartbeat │
├─────────────────────────────────────────────────────────────────┤
│ Communication: PeerTransport | WireProtocol | Collectives │
├─────────────────────────────────────────────────────────────────┤
│ Monitoring: Thermal | Health | Throughput | Dashboard │
└─────────────────────────────────────────────────────────────────┘
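The Compression layer's link-aware behavior can be sketched as follows. This is illustrative only: compress, decompress, and RATIO_BY_LINK are our names, the ratios mirror the defaults stated above (WiFi=200x, Ethernet=20x, TB4=off), and the FP16 cast is approximated with the standard library's half-float struct format rather than MacFleet's actual wire encoding:

```python
import struct

# Link type -> target compression ratio (1 means "send uncompressed").
RATIO_BY_LINK = {"wifi": 200, "ethernet": 20, "thunderbolt": 1}

def compress(grad, link):
    """TopK + FP16 sketch: keep the k largest-magnitude gradient entries."""
    ratio = RATIO_BY_LINK[link]
    if ratio == 1:
        return None, grad                      # fast link: send as-is
    k = max(1, len(grad) // ratio)
    top = sorted(range(len(grad)), key=lambda i: abs(grad[i]), reverse=True)[:k]
    # Round each kept value through IEEE half precision ('e' format).
    packed = [(i, struct.unpack("e", struct.pack("e", grad[i]))[0]) for i in top]
    return packed, None

def decompress(packed, size):
    """Scatter the sparse (index, value) pairs back into a dense vector."""
    out = [0.0] * size
    for i, v in packed:
        out[i] = v
    return out
```

On a 200-element gradient over WiFi this keeps a single (index, value) pair, which is where the ~200x figure comes from; dropped coordinates decompress to zero on the receiving side.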
Development
git clone https://github.com/yourusername/MacFleet.git
cd MacFleet
pip install -e ".[dev]"
make test # 268 tests
make bench # compute + network + allreduce benchmarks
make lint # ruff + mypy
Requirements
- Python 3.11+
- macOS with Apple Silicon (M1/M2/M3/M4)
- PyTorch 2.1+ (for torch engine)
- MLX 0.5+ (optional, for mlx engine)
License
MIT
Download files

Source distribution: macfleet-2.0.0.tar.gz (60.2 kB)
Built distribution: macfleet-2.0.0-py3-none-any.whl (73.6 kB)
File details

Details for the file macfleet-2.0.0.tar.gz.

- Size: 60.2 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.12.7

File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | 3ca168dd1057e528bacf3c3e5082c3b79c549111cd479cacb5ecd062eec4e143 |
| MD5 | 0013eed2bfcecdd423057b5b09940e83 |
| BLAKE2b-256 | 1ff169bc72e1388bcffd094a7d5af902b3f95940f77d64fcf9a38bf9b19ff684 |
File details

Details for the file macfleet-2.0.0-py3-none-any.whl.

- Size: 73.6 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.12.7

File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | b193f29d73bf94903a7607541130f13c91daf49736c7f629310acfa86147bd9d |
| MD5 | 8d76ad93f9bc2e886cc43b215a626062 |
| BLAKE2b-256 | 24de8d488982caba406aa316b030fe92d9934f7d0fef96cc6775bd1a80bf68a7 |