OrbitalBrain: A Distributed Framework For Training ML Models in Space

New Ideas in Networked Systems (NINeS)

Earth observation nanosatellites capture high-resolution images of the Earth in near real time. These images increasingly support ML applications that are critical for safety and disaster response, such as forest fire and flood detection. However, downlink bandwidth is limited, introducing days or weeks of delay between image capture and model training. In this work, we propose OrbitalBrain, an efficient in-space distributed ML training framework that leverages limited but predictable satellite compute, bandwidth, and power to intelligently balance data transfer, model aggregation, and local training. Our evaluations demonstrate that OrbitalBrain achieves a 1.52x-12.4x speedup in time-to-accuracy while consistently reaching a higher final model accuracy than state-of-the-art ground-based and federated learning baselines. Furthermore, our approach is complementary to satellite imagery capture and downlinking, enhancing the overall efficiency of satellite-based applications.
