Efficient Asynchronous Federated Learning Research in the Internet of Vehicles

Abstract:

Federated learning (FL) is a distributed machine learning paradigm that ensures data never leave local devices. FL can therefore address data-sharing problems in untrusted environments such as the Internet of Vehicles (IoV). However, FL must frequently exchange large volumes of model parameters to reach preset model targets. In addition, bandwidth fluctuations and communication delays caused by vehicle mobility make it difficult to keep model parameters synchronized. In this article, an efficient hierarchical asynchronous FL (EHAFL) algorithm is proposed that dynamically adjusts the encoding length according to the available bandwidth, substantially reducing communication cost. A dynamic hierarchical asynchronous aggregation mechanism that leverages gradient sparsification and asynchronous aggregation is further proposed to reduce communication costs and improve the aggregation efficiency of the global model. Simulation results on MNIST and real-world data sets show that the proposed solution reduces communication costs by 98% while compromising model accuracy by only 1%.
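The gradient-sparsification and asynchronous-aggregation ideas named in the abstract can be illustrated with a minimal sketch. The function names, the top-k selection rule, and the staleness weighting below are illustrative assumptions, not the exact EHAFL procedure described in the article.

```python
import numpy as np

def topk_sparsify(grad, ratio=0.01):
    """Keep only the largest-magnitude `ratio` fraction of gradient entries.

    Returns the indices and values of the retained entries; the remaining
    entries are treated as zero and never transmitted.
    """
    flat = grad.ravel()
    k = max(1, int(ratio * flat.size))
    idx = np.argpartition(np.abs(flat), -k)[-k:]  # indices of the top-k magnitudes
    return idx, flat[idx]

def staleness_weighted_aggregate(global_w, client_update, staleness, base_lr=1.0):
    """Merge one asynchronous client update into the global model.

    Staler updates receive a smaller weight, a common heuristic in
    asynchronous FL; the weighting used by EHAFL may differ.
    """
    alpha = base_lr / (1.0 + staleness)
    return global_w + alpha * client_update

# Toy usage: sparsify one client gradient and apply it asynchronously.
rng = np.random.default_rng(0)
global_w = np.zeros(10_000)
grad = rng.normal(size=10_000)

idx, vals = topk_sparsify(grad, ratio=0.01)  # roughly 99% of entries are dropped
sparse_update = np.zeros_like(global_w)
sparse_update[idx] = vals
global_w = staleness_weighted_aggregate(global_w, sparse_update, staleness=3)
```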