Big tech companies make no secret of using artificial intelligence to track your browsing behavior and target products and services to your interests and needs. While that enables personalized services, it also means sharing your data with a third party, which raises the specter of compromised digital data. Federated Learning (FL) systems, in contrast, are gaining traction because they compute all updates on mobile devices, keeping the data local. The drawback of standard FL systems, however, is that they are unsuited to applications that require frequent online updates, such as news recommendation.

In a major step forward for FL systems, researchers from EPFL and INRIA have developed FLeet, the first online FL system, which makes it possible to carry out machine learning on mobile devices in real time without disrupting their normal use. More importantly, data is not shared with any tech company but remains local and secure. The research won the best paper award at the 2020 ACM/IFIP Middleware Conference.

FLeet delivers the best of both worlds: the privacy of standard FL systems and the precision of online learning. Two main components make this possible: I-Prof, a new lightweight profiler that predicts and controls the impact of learning tasks on mobile devices, and AdaSGD, a new adaptive learning algorithm that is resilient to delayed (stale) updates.
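The core idea behind staleness-resilient aggregation can be illustrated with a toy sketch. The snippet below is not the authors' AdaSGD implementation; it is a minimal illustration of the general staleness-awareness principle, assuming a hypothetical server class, an exponential dampening rule, and made-up parameter values. A gradient computed against an old model version is applied with a smaller weight, so late-arriving updates cannot drag the shared model backward.

```python
import numpy as np

def staleness_weight(staleness, decay=0.2):
    """Dampen late gradients exponentially (an illustrative choice,
    not the dampening function used by AdaSGD)."""
    return np.exp(-decay * staleness)

class StalenessAwareSGD:
    """Toy server-side aggregator: applies device gradients to a shared
    model, scaling each one down by how stale it is."""

    def __init__(self, dim, lr=0.1, decay=0.2):
        self.w = np.zeros(dim)   # shared model parameters
        self.version = 0         # current model version number
        self.lr = lr
        self.decay = decay

    def apply(self, grad, model_version):
        # Staleness = number of updates applied since the device
        # downloaded the model it computed this gradient against.
        staleness = self.version - model_version
        self.w -= self.lr * staleness_weight(staleness, self.decay) * grad
        self.version += 1
```

A fresh gradient (staleness 0) is applied at full weight, while one computed five versions ago is scaled by exp(-1) ≈ 0.37, so slow devices still contribute without destabilizing the model.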

Test results show that FLeet can deliver a 2.3× quality boost over standard FL while consuming only 0.036% of the battery per day. I-Prof improves prediction accuracy by up to 3.6× for computation time and up to 19× for energy, while AdaSGD outperforms alternative FL approaches by 18.4% in convergence speed on heterogeneous data.

Rachid Guerraoui and Anne-Marie Kermarrec, professors at EPFL’s School of Computer and Communication Sciences and co-authors of the study, emphasize that today’s smartphones have the power to enable distributed machine learning without having to share raw data with, or rely on, large centralized systems.

As Professor Guerraoui explains, “With FLeet it is possible, while you are using your mobile phone, to use some of its spare power towards machine learning tasks without having to worry that your call or internet search will be interrupted. … We don’t want the machine learning to only be happening when you are asleep and your phone is on charge. … Sometimes we want, and need, real-time information.” Professor Kermarrec explains how the findings of their study can foster “truly collaborative learning, where local models are aggregated and contribute to the global model, but you don’t share raw data, and that protects your privacy.”

The researchers are currently exploring options to develop the FLeet prototype into a safe, secure, and usable end product.

The paper, “FLeet: Online Federated Learning via Staleness Awareness and Performance Prediction,” was authored by Georgios Damaskinos (Facebook), Rachid Guerraoui and Anne-Marie Kermarrec (EPFL), Rhicheek Patra (Oracle Labs), François Taïani (INRIA/University of Rennes), and Vlad Nitu (CNRS).