Swarm-Tune
Status: In Progress

A decentralized AI fine-tuning swarm: nodes discover each other over libp2p, exchange raw PyTorch gradients across the mesh, average them without a central server, and collectively fine-tune a model inside a Docker-simulated network on a single machine.
- Phase 1: Whisper Network — libp2p P2P node discovery & messaging, no central server
- Phase 2: Gradient Math — manual PyTorch gradient extraction, bypassing DDP for network transit
- Phase 3: Synchronisation — nodes exchange & average gradients across libp2p; handling the straggler problem
- Phase 4: Docker Simulation — 4-container swarm on M4 Pro collectively fine-tuning a model
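For Phase 4, the 4-container swarm can be expressed as a Docker Compose file with four identical services on one bridge network. This is a hypothetical sketch; the service names, environment variable, and build context are assumptions, not the repo's actual config.

```yaml
# Hypothetical docker-compose.yml: four identical swarm nodes on a single
# bridge network, sharing one image built from the repo root.
services:
  node1: &node
    build: .
    networks: [swarm]
    environment:
      # Illustrative bootstrap hint; actual discovery happens over libp2p.
      - SWARM_PEERS=node1,node2,node3,node4
  node2: *node
  node3: *node
  node4: *node

networks:
  swarm:
    driver: bridge
```

The YAML anchor (`&node` / `*node`) keeps the four node definitions identical without repetition; `docker compose up` then brings up the whole swarm on one machine.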
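The core of Phases 2 and 3 can be sketched in plain PyTorch: extract each parameter's raw gradient into a serializable dict after a local backward pass, then average the dicts received from peers by hand instead of letting DDP all-reduce them. This is a minimal sketch, not the Swarm-Tune implementation; the model, function names, and the SGD step are illustrative assumptions.

```python
import torch
import torch.nn as nn

# Hypothetical stand-in for the fine-tuned model; names here are
# illustrative, not from the Swarm-Tune codebase.
torch.manual_seed(0)
model = nn.Linear(4, 2)

def local_gradients(model, x, y):
    """One local step: compute loss, backprop, and extract raw gradients
    as a {param_name: tensor} dict ready for network transit (Phase 2)."""
    model.zero_grad()
    loss = nn.functional.mse_loss(model(x), y)
    loss.backward()
    return {name: p.grad.detach().clone() for name, p in model.named_parameters()}

def average_gradients(grad_dicts):
    """All-reduce by hand: element-wise mean of each parameter's gradient
    across peers (Phase 3). Assumes every peer reports every parameter."""
    names = grad_dicts[0].keys()
    return {n: torch.stack([g[n] for g in grad_dicts]).mean(dim=0) for n in names}

# Simulate three peers computing gradients on different local batches.
peers = [local_gradients(model, torch.randn(8, 4), torch.randn(8, 2)) for _ in range(3)]
avg = average_gradients(peers)

# Apply the averaged gradient with a plain SGD step.
with torch.no_grad():
    for name, p in model.named_parameters():
        p -= 0.01 * avg[name]
```

In the real swarm the `grad_dicts` would arrive serialized over libp2p streams rather than from a local loop, but the averaging math is the same.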
