To satisfy the enormous data transmission demands, caching popular files in the memory of user terminals (UTs) is a promising solution that can alleviate the heavy burden on backhaul links and shorten the transmission delay. In this paper, we study the user caching strategy by exploiting the effect of user mobility. Contacts between mobile users are modeled as a Poisson process. Both Zipf-distributed and uniformly distributed file demands are considered in the caching strategy. To improve users' quality of experience (QoE) for delay-sensitive services, we define a user satisfaction metric in terms of the delay time and maximize it through the proposed cache placement strategy. The resulting optimization problem, with mixed equality and inequality constraints, is solved by the multiplier penalty function (MPF) method. Numerical results reveal that the maximal average user satisfaction is achieved when the file caching is coordinated with the file demand distribution.
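To illustrate the demand and contact models mentioned in the abstract, the following Python sketch simulates Zipf-distributed file requests and Poisson-distributed user contacts, and estimates the fraction of requests served within a delay deadline. It is not the paper's method: the MPF-optimized placement is replaced by a simple most-popular-files placement, and all parameters (catalog size, Zipf exponent, cache size, contact rate, deadline) are assumed placeholders.

```python
import numpy as np

# --- Assumed illustrative parameters (not from the paper) ---
N_FILES = 100          # size of the file catalog
ZIPF_EXPONENT = 0.8    # skew of the Zipf popularity distribution
CACHE_SIZE = 10        # number of files each user terminal can cache
CONTACT_RATE = 2.0     # mean user contacts per unit time (Poisson process)
DEADLINE = 1.0         # delay deadline for a "satisfied" request
N_REQUESTS = 100_000   # number of simulated file requests

rng = np.random.default_rng(0)

# Zipf-distributed file popularity: p_i proportional to 1 / i^alpha
ranks = np.arange(1, N_FILES + 1)
popularity = ranks ** (-ZIPF_EXPONENT)
popularity /= popularity.sum()

# Placeholder placement: cache the most popular files
# (the paper instead optimizes placement with the MPF method)
cached = np.zeros(N_FILES, dtype=bool)
cached[:CACHE_SIZE] = True

# Draw requests; for cache misses, draw the waiting time until the next
# contact with another user (exponential inter-contact times, i.e. Poisson)
requests = rng.choice(N_FILES, size=N_REQUESTS, p=popularity)
hit = cached[requests]
wait = rng.exponential(1.0 / CONTACT_RATE, size=N_REQUESTS)

# A request counts as satisfied if served locally or within the deadline
satisfied = hit | (wait <= DEADLINE)
print(f"cache hit ratio:   {hit.mean():.3f}")
print(f"avg. satisfaction: {satisfied.mean():.3f}")
```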


Title: Mobility-Aware User Caching Strategy with QoE Maximization

Contributors: Teng, Yinglei (author) / Lu, Guofeng (author) / Sun, Weiqi (author) / Ma, Yue (author)

Publication date: 2017-06-01

Size: 195145 bytes

Type of media: Conference paper

Type of material: Electronic Resource

Language: English




    Mobility-Aware Proactive Edge Caching for Connected Vehicles Using Federated Learning

    Yu, Zhengxin / Hu, Jia / Min, Geyong et al. | IEEE | 2021



    Caching Strategy Considerations

    Do, Sydney / Delfa, Juan / Spencer, David | NTRS | 2021