As a novel paradigm, Vehicular Edge Computing (VEC) can effectively support computation-intensive and delay-sensitive applications in the Internet of Vehicles era. Computation offloading and resource management strategies are key technologies that directly determine the system cost in VEC networks. However, due to vehicle mobility and stochastically arriving computation tasks, designing an optimal offloading and resource allocation policy is extremely challenging. To address this issue, a deep reinforcement learning-based intelligent offloading and power allocation scheme is proposed to minimize the total delay and energy consumption cost in dynamic heterogeneous VEC networks. Specifically, we first construct an end-edge-cloud offloading model in a bidirectional road scenario, taking into account stochastic task arrivals, time-varying channel conditions, and vehicle mobility. With the objective of minimizing the long-term total cost composed of energy consumption and task delay, we formulate the optimization problem as a Markov Decision Process (MDP). Moreover, considering the high-dimensional, continuous action space and the dynamics of task generation, we propose a deep deterministic policy gradient-based adaptive computation offloading and power allocation (DDPG-ACOPA) algorithm to solve the formulated MDP. Extensive simulation results demonstrate that the proposed DDPG-ACOPA algorithm performs well in the dynamic heterogeneous VEC environment, significantly outperforming four baseline schemes.
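For orientation, the sketch below illustrates the general actor-critic structure that a deep deterministic policy gradient (DDPG) approach of this kind relies on: a deterministic actor maps the observed VEC state to continuous offloading-ratio and transmit-power actions, and a critic estimates the long-term cost of a state-action pair, which is why DDPG suits the continuous action space described in the abstract. This is a minimal illustration assuming PyTorch, hypothetical state and action dimensions, and a delay-plus-energy reward; it is not the paper's DDPG-ACOPA implementation.

# Minimal sketch (not the paper's implementation) of a DDPG-style actor-critic
# for joint offloading-ratio and transmit-power decisions. State/action sizes,
# network widths, and learning rates below are illustrative assumptions only.
import copy
import torch
import torch.nn as nn

STATE_DIM = 8    # assumed: channel gains, queue backlog, vehicle position, ...
ACTION_DIM = 2   # assumed: [offloading ratio, normalized transmit power] in [0, 1]

class Actor(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(STATE_DIM, 128), nn.ReLU(),
            nn.Linear(128, 128), nn.ReLU(),
            nn.Linear(128, ACTION_DIM), nn.Sigmoid(),  # continuous actions in [0, 1]
        )
    def forward(self, s):
        return self.net(s)

class Critic(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(STATE_DIM + ACTION_DIM, 128), nn.ReLU(),
            nn.Linear(128, 128), nn.ReLU(),
            nn.Linear(128, 1),  # Q(s, a)
        )
    def forward(self, s, a):
        return self.net(torch.cat([s, a], dim=-1))

actor, critic = Actor(), Critic()
actor_target, critic_target = copy.deepcopy(actor), copy.deepcopy(critic)
actor_opt = torch.optim.Adam(actor.parameters(), lr=1e-4)
critic_opt = torch.optim.Adam(critic.parameters(), lr=1e-3)

def ddpg_update(s, a, r, s_next, gamma=0.99, tau=0.005):
    """One update on a replay-buffer minibatch; here the reward r would be the
    negative weighted sum of task delay and energy consumption."""
    with torch.no_grad():
        y = r + gamma * critic_target(s_next, actor_target(s_next))  # TD target
    critic_loss = nn.functional.mse_loss(critic(s, a), y)
    critic_opt.zero_grad(); critic_loss.backward(); critic_opt.step()

    actor_loss = -critic(s, actor(s)).mean()  # deterministic policy gradient
    actor_opt.zero_grad(); actor_loss.backward(); actor_opt.step()

    for net, target in ((actor, actor_target), (critic, critic_target)):
        for p, p_t in zip(net.parameters(), target.parameters()):
            p_t.data.mul_(1 - tau).add_(tau * p.data)  # soft target update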
Deep Reinforcement Learning-Based Adaptive Computation Offloading and Power Allocation in Vehicular Edge Computing Networks
IEEE Transactions on Intelligent Transportation Systems, Vol. 25, No. 10, pp. 13339-13349
1 October 2024
6,149,170 bytes
Journal article
Electronic resource
English
Deep Reinforcement Learning Based Computation Offloading in UAV-Assisted Edge Computing
DOAJ | 2023
A Reinforcement Learning Based Task Offloading Scheme for Vehicular Edge Computing Network
Springer Verlag | 2019