Vehicular Edge Computing (VEC) offers a promising framework for providing vehicles with low-latency and highly reliable services. By leveraging the underutilized computational resources of parked and moving vehicles commonly found in urban areas, a VEC system can enhance the performance of surrounding user devices and alleviate the load on its edge servers. In this study, a resource orchestration scheme is introduced for a multi-device, multi-vehicle, and multi-edge scenario. Tasks from a device can be offloaded to its associated edge server, a neighboring edge server, a parked vehicle, or a moving vehicle. Our goal is to minimize the total task processing cost (comprising task processing latency and energy consumption) across all devices by jointly optimizing task offloading decisions and the allocation of computational and communication resources. We decompose the optimization problem and propose a Twin Delayed Deep Deterministic Policy Gradient (TD3)-based Deep Reinforcement Learning (DRL) algorithm. Furthermore, to accelerate convergence, we solve the uplink transmit power allocation sub-problem separately with a dedicated numerical algorithm. We analyze the complexity of the algorithm and assess its convergence. In extensive simulations across five different scenarios, the proposed scheme outperforms four reference schemes, reducing the total task processing cost by 15.13% to 38.59%.
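The abstract names the algorithm family but gives no implementation details, so the following is only a minimal orientation sketch of a generic TD3 update step in PyTorch, not the authors' offloading agent. All names, dimensions, and hyperparameters (STATE_DIM, ACTION_DIM, learning rates, the random smoke-test batch) are hypothetical assumptions; the sketch only illustrates the three TD3 ingredients such a scheme would rely on: twin critics with clipped double-Q targets, target policy smoothing, and delayed actor/target updates.

```python
# Generic TD3 update sketch (PyTorch). Illustrative only; all sizes and
# hyperparameters below are hypothetical assumptions, not the paper's values.
import copy
import torch
import torch.nn as nn

STATE_DIM, ACTION_DIM, MAX_ACTION = 16, 4, 1.0  # hypothetical dimensions

def mlp(in_dim, out_dim):
    return nn.Sequential(nn.Linear(in_dim, 256), nn.ReLU(),
                         nn.Linear(256, 256), nn.ReLU(),
                         nn.Linear(256, out_dim))

class Actor(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = mlp(STATE_DIM, ACTION_DIM)
    def forward(self, s):
        return MAX_ACTION * torch.tanh(self.net(s))  # bounded actions

class Critic(nn.Module):  # twin Q-networks, the "Twin" in TD3
    def __init__(self):
        super().__init__()
        self.q1 = mlp(STATE_DIM + ACTION_DIM, 1)
        self.q2 = mlp(STATE_DIM + ACTION_DIM, 1)
    def forward(self, s, a):
        sa = torch.cat([s, a], dim=1)
        return self.q1(sa), self.q2(sa)

actor, critic = Actor(), Critic()
actor_t, critic_t = copy.deepcopy(actor), copy.deepcopy(critic)  # targets
a_opt = torch.optim.Adam(actor.parameters(), lr=3e-4)
c_opt = torch.optim.Adam(critic.parameters(), lr=3e-4)
GAMMA, TAU, POLICY_NOISE, NOISE_CLIP, POLICY_DELAY = 0.99, 0.005, 0.2, 0.5, 2

def td3_update(batch, step):
    s, a, r, s2, done = batch
    with torch.no_grad():
        # Target policy smoothing: clipped noise on the target action.
        noise = (torch.randn_like(a) * POLICY_NOISE).clamp(-NOISE_CLIP, NOISE_CLIP)
        a2 = (actor_t(s2) + noise).clamp(-MAX_ACTION, MAX_ACTION)
        # Clipped double-Q: bootstrap from the minimum of the twin targets.
        q1_t, q2_t = critic_t(s2, a2)
        target = r + GAMMA * (1 - done) * torch.min(q1_t, q2_t)
    q1, q2 = critic(s, a)
    c_loss = nn.functional.mse_loss(q1, target) + nn.functional.mse_loss(q2, target)
    c_opt.zero_grad(); c_loss.backward(); c_opt.step()
    if step % POLICY_DELAY == 0:  # the "Delayed" in TD3
        a_loss = -critic(s, actor(s))[0].mean()  # ascend Q1 w.r.t. the actor
        a_opt.zero_grad(); a_loss.backward(); a_opt.step()
        for p, pt in zip(critic.parameters(), critic_t.parameters()):
            pt.data.mul_(1 - TAU).add_(TAU * p.data)  # Polyak averaging
        for p, pt in zip(actor.parameters(), actor_t.parameters()):
            pt.data.mul_(1 - TAU).add_(TAU * p.data)

# Smoke test on random transitions (hypothetical data, not a VEC simulator).
batch = (torch.randn(32, STATE_DIM), torch.rand(32, ACTION_DIM) * 2 - 1,
         torch.randn(32, 1), torch.randn(32, STATE_DIM), torch.zeros(32, 1))
for step in range(4):
    td3_update(batch, step)
```

In the paper's setting, the state would encode device, vehicle, and edge-server conditions and the action would encode offloading and resource-allocation decisions, with the uplink power sub-problem handled outside the agent by the separate numerical algorithm; those mappings are not specified in the abstract.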
DRL-Based Resource Orchestration for Vehicular Edge Computing With Multi-Edge and Multi-Vehicle Assistance
IEEE Transactions on Intelligent Transportation Systems; 26(6); 8764-8779
2025-06-01
Article (Journal)
Electronic Resource
English