Mobile Edge Computing (MEC) is a promising solution for delay-sensitive emerging applications. In dense deployments of MEC-enabled small base stations (SBSs), users have multiple options for wireless access and computing service, which makes mobility management (MM) more complicated. To this end, we study the MM problem during users' movement in the ultra-dense edge computing scenario, aiming to minimize the delay of offloading tasks with the handover cost as a penalty term. In this paper, we propose an online learning optimization scheme based on reinforcement learning that optimizes handover decision-making by predicting upcoming information. Simulation results show that the proposed scheme effectively reduces both the average delay of users' computing tasks and the handover rate compared with conventional handover schemes.
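The abstract describes an online reinforcement learning scheme whose objective couples task delay with a handover penalty. The sketch below is not the authors' algorithm; it is a minimal epsilon-greedy Q-learning illustration of that idea, where the agent picks a serving SBS for each offloaded task and the reward is the negative observed delay minus a fixed penalty whenever the serving SBS changes. All names, parameter values, and the synthetic delay model are illustrative assumptions.

```python
import random

NUM_SBS = 5               # candidate MEC-enabled small base stations (assumed)
HANDOVER_PENALTY = 0.5    # cost added when the serving SBS changes (assumed)
ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.1

# State: index of the currently serving SBS; action: SBS chosen for the next task.
q_table = [[0.0 for _ in range(NUM_SBS)] for _ in range(NUM_SBS)]

def simulated_task_delay(sbs: int) -> float:
    """Stand-in for the observed offloading delay at an SBS (purely synthetic)."""
    base = 1.0 + 0.3 * sbs                 # assumed per-SBS load/propagation difference
    return base + random.uniform(0.0, 0.5)

def choose_sbs(current: int) -> int:
    """Epsilon-greedy action selection over candidate SBSs."""
    if random.random() < EPSILON:
        return random.randrange(NUM_SBS)
    row = q_table[current]
    return max(range(NUM_SBS), key=lambda a: row[a])

def step(current: int) -> int:
    """Offload one task, observe delay plus handover cost, and update the Q-table."""
    action = choose_sbs(current)
    delay = simulated_task_delay(action)
    cost = delay + (HANDOVER_PENALTY if action != current else 0.0)
    reward = -cost
    best_next = max(q_table[action])
    q_table[current][action] += ALPHA * (reward + GAMMA * best_next - q_table[current][action])
    return action

if __name__ == "__main__":
    serving = 0
    for _ in range(10_000):
        serving = step(serving)
    print("Learned action values when served by SBS 0:", q_table[0])
```

With the handover penalty in the cost, the agent only switches SBSs when the expected delay reduction outweighs the switching cost, mirroring the trade-off the abstract targets.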
Mobility Management for Ultra-Dense Edge Computing: A Reinforcement Learning Approach
01.09.2019
Conference paper
Electronic resource
English