Efficient Federated DRL Based Cooperative Caching for Mobile Edge Networks

Abstract:

Edge caching has been regarded as a promising technique for low-latency, high-rate data delivery in future networks, and there is increasing interest in leveraging Machine Learning (ML) for content placement instead of traditional optimization-based methods, owing to its self-adaptive ability in complex environments. Despite many efforts on ML-based cooperative caching, several key issues remain to be addressed, in particular reducing computation complexity and communication costs while optimizing cache efficiency. To this end, in this paper we propose an efficient federated DRL-based cooperative caching framework (FDDL) to address these issues in mobile edge networks. Specifically, we propose a DRL-CA algorithm for cache admission, which extracts a broader set of attributes from massive requests to improve cache efficiency. We then present a lightweight eviction algorithm for fine-grained replacement of unpopular contents. Moreover, we present a Federated Learning based parameter sharing mechanism to reduce the signaling overhead of collaboration. We implement an emulation system and evaluate the caching performance of the proposed FDDL. Emulation results show that FDDL achieves a higher cache hit ratio and traffic offloading rate than several conventional caching policies and DRL-based caching algorithms, while effectively reducing communication costs and training time.
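The abstract does not specify the aggregation rule used by the parameter sharing mechanism; the snippet below is a minimal sketch, assuming a FedAvg-style weighted average of the DRL model parameters exchanged among edge nodes. The function name fedavg_aggregate, the layer names, and the sample-count weighting are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def fedavg_aggregate(local_params, sample_counts):
    """Weighted average of per-node DRL parameter sets (FedAvg-style sketch).

    local_params:  list of dicts mapping layer name -> np.ndarray,
                   one dict per edge node (hypothetical layout).
    sample_counts: list of per-node request counts used as weights.
    """
    total = float(sum(sample_counts))
    global_params = {}
    for name in local_params[0]:
        # Weighted sum of each node's parameters for this layer.
        global_params[name] = sum(
            (n / total) * p[name] for p, n in zip(local_params, sample_counts)
        )
    return global_params

# Example: two edge nodes sharing a tiny two-layer parameter set.
node_a = {"w1": np.ones((2, 2)), "w2": np.zeros(2)}
node_b = {"w1": np.zeros((2, 2)), "w2": np.ones(2)}
print(fedavg_aggregate([node_a, node_b], sample_counts=[300, 100]))
```

Sharing only model parameters rather than raw request traces is what keeps the collaboration signaling overhead low in a scheme of this kind.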