Deep Q Learning Based Dynamic Network Slicing and Task Offloading in Edge Network
Abstract:

Recently, Edge Computing (EC) has become a promising enabler to support emerging applications in 5G mobile networks by offloading compute-intensive tasks from devices to proximate EC servers. Meanwhile, Network Slicing (NS) aims to provide service subscribers (SSs) with dedicated network resources based on virtualization techniques so that service requirements can be guaranteed. The combination of EC and NS can efficiently utilize dynamic network resources at edge networks while improving the Quality of Service (QoS) of SSs. In this paper, we jointly address the problem of dynamic slice scaling and task offloading from the perspective of service provider (SP) profit in a multi-tenant EC system. Specifically, we propose a Deep Q-Learning (DQL) based network slicing framework to dynamically reconfigure the scale of radio and computing resources of a slice reserved for a target SP. Then, by exploiting alternating optimization, we propose a low-complexity algorithm to optimize the real-time offloading ratio and resource allocation policy for slice requests from SSs. To further verify the proposed framework, we implemented a network slicing testbed with Docker containers and conducted a series of experiments based on a real-world traffic dataset and a sample Augmented Reality (AR) application.
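
The abstract describes a DQL agent that reconfigures the radio and computing resources of a reserved slice. The sketch below is not the authors' implementation; it only illustrates the general DQL mechanism (Q-network, epsilon-greedy action selection, experience replay) with assumed state features, a hypothetical three-action scaling set, and placeholder hyperparameters.

```python
# Minimal illustrative DQL agent for slice scaling (assumptions, not the paper's code).
import random
from collections import deque

import torch
import torch.nn as nn
import torch.optim as optim

STATE_DIM = 4  # assumed features: traffic load, queue length, allocated radio, allocated CPU
ACTIONS = ["scale_down", "hold", "scale_up"]  # hypothetical slice-scaling decisions

class QNetwork(nn.Module):
    """Maps a slice state to a Q-value per scaling action."""
    def __init__(self, state_dim, n_actions):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(state_dim, 64), nn.ReLU(),
            nn.Linear(64, 64), nn.ReLU(),
            nn.Linear(64, n_actions),
        )

    def forward(self, x):
        return self.net(x)

q_net = QNetwork(STATE_DIM, len(ACTIONS))
target_net = QNetwork(STATE_DIM, len(ACTIONS))
target_net.load_state_dict(q_net.state_dict())
optimizer = optim.Adam(q_net.parameters(), lr=1e-3)
replay = deque(maxlen=10_000)      # experience replay buffer
gamma, epsilon = 0.99, 0.1         # placeholder discount and exploration rates

def select_action(state):
    # Epsilon-greedy choice over Q-values for the current slice state.
    if random.random() < epsilon:
        return random.randrange(len(ACTIONS))
    with torch.no_grad():
        return q_net(torch.tensor(state)).argmax().item()

def train_step(batch_size=32):
    # One gradient step on a sampled minibatch of (state, action, reward, next_state).
    if len(replay) < batch_size:
        return
    s, a, r, s2 = map(torch.tensor, zip(*random.sample(replay, batch_size)))
    q = q_net(s.float()).gather(1, a.long().unsqueeze(1)).squeeze(1)
    with torch.no_grad():
        target = r.float() + gamma * target_net(s2.float()).max(1).values
    loss = nn.functional.mse_loss(q, target)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```

In this reading, the reward would reflect SP profit (revenue from served slice requests minus the cost of reserved radio and computing resources), while the inner offloading-ratio and resource-allocation decisions described in the abstract would be solved per time slot by the separate alternating-optimization routine.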