Mobile Edge Computing (MEC) is a promising solution for delay-sensitive emerging applications. In a dense deployment of MEC-enabled small base stations (SBSs), users have multiple options for wireless access and computing service, which makes mobility management (MM) more complicated. To this end, we study the MM problem for moving users in an ultra-dense edge computing scenario, aiming to minimize the delay of offloaded tasks with the handover cost treated as a penalty term. In this paper, we propose an online learning optimization scheme based on reinforcement learning that optimizes handover decision-making by predicting upcoming future information. Simulation results show that the proposed scheme effectively reduces the average delay of users' computing tasks and the handover rate compared with conventional handover schemes.
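
To give a rough sense of the kind of decision rule the abstract describes, the sketch below shows a minimal tabular Q-learning loop in which an agent chooses which SBS to hand over to and is rewarded for low task delay minus a handover penalty. The paper's exact algorithm, state definition, reward weighting, and parameter values are not given here, so everything in this snippet (the number of SBSs, the simulated delays, the constants) is an illustrative assumption, not the authors' method.

```python
# Illustrative sketch only: a tabular Q-learning agent that picks which MEC-enabled
# SBS to hand over to. The state/action definitions, reward weighting, and simulated
# delays below are assumptions made purely for illustration, not the paper's scheme.
import random
from collections import defaultdict

NUM_SBS = 5              # number of candidate small base stations (assumed)
HANDOVER_PENALTY = 2.0   # penalty added when the serving SBS changes (assumed)
ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.1

# Q[(current_sbs, target_sbs)] -> estimated long-run negative cost of handing over
Q = defaultdict(float)

def simulated_task_delay(sbs: int) -> float:
    """Stand-in for the measured offloading delay at an SBS (random here)."""
    return random.uniform(1.0, 10.0)

def choose_target(current_sbs: int) -> int:
    """Epsilon-greedy handover decision: mostly exploit the best-known SBS."""
    if random.random() < EPSILON:
        return random.randrange(NUM_SBS)
    return max(range(NUM_SBS), key=lambda a: Q[(current_sbs, a)])

def step(current_sbs: int) -> int:
    target = choose_target(current_sbs)
    delay = simulated_task_delay(target)
    handover_cost = HANDOVER_PENALTY if target != current_sbs else 0.0
    # Reward mirrors the stated objective: minimize task delay plus a handover penalty.
    reward = -(delay + handover_cost)
    best_next = max(Q[(target, a)] for a in range(NUM_SBS))
    Q[(current_sbs, target)] += ALPHA * (reward + GAMMA * best_next - Q[(current_sbs, target)])
    return target

if __name__ == "__main__":
    sbs = 0
    for _ in range(10_000):   # online interaction loop
        sbs = step(sbs)
    print("Learned preferences from SBS 0:", {a: round(Q[(0, a)], 2) for a in range(NUM_SBS)})
```

In this toy setup the handover penalty discourages frequent switching, so the learned policy trades off lower delay against a lower handover rate, which is the same trade-off the abstract's objective expresses.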


    Title:

    Mobility Management for Ultra-Dense Edge Computing: A Reinforcement Learning Approach


    Contributors:

    Zhang, Haibin (Author) / Wang, Rong (Author) / Liu, Jiajia (Author)


    Publication date:

    01.09.2019


    Format / Extent:

    312755 bytes


    Media type:

    Conference paper


    Format:

    Electronic resource


    Language:

    English