Mobile edge computing (MEC) and energy harvesting (EH) are two promising techniques for enhancing data-processing performance and extending the lifetime of wireless devices. In this paper, we investigate a joint computation offloading and resource allocation problem with multiple user equipments (UEs) equipped with EH devices and rechargeable batteries. The objective is to minimize the long-term system energy consumption while satisfying each UE's delay constraint. The problem is formulated as an intractable mixed-integer nonlinear program (MINLP), which we decompose into three phases. First, we apply a deep reinforcement learning framework, Deep Deterministic Policy Gradient (DDPG), to obtain the continuous power allocation. Then, the Karush-Kuhn-Tucker (KKT) conditions and the Lagrangian function are used to obtain the channel assignment. Finally, we update the state, action, and reward of the DDPG framework. Simulation results show that, compared with other algorithms, the proposed algorithm finds offloading decisions and power allocations with the lowest energy consumption.
DDPG Based Computation Offloading and Resource Allocation for MEC Systems with Energy Harvesting
01.04.2021
2991182 bytes
Conference paper
Electronic resource
English
DOAJ | 2024
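The three-phase loop described in the abstract (DDPG for continuous power allocation, a KKT/Lagrangian step for channel assignment, then updating the DDPG state, action, and reward) can be pictured with a short sketch. The Python/PyTorch snippet below is only an illustration under assumed problem sizes and a dummy environment: the helper kkt_channel_assignment, the reward model, and the network dimensions are hypothetical placeholders, and standard DDPG components such as target networks and soft updates are omitted for brevity; it is not the authors' implementation.

```python
# Minimal sketch of the three-phase loop, assuming a simplified MEC setting.
# All dynamics, rewards, and the KKT-based assignment are placeholders.
import numpy as np
import torch
import torch.nn as nn

N_UE, STATE_DIM, ACTION_DIM = 4, 12, 4   # assumed problem sizes

class Actor(nn.Module):
    """Maps the system state (e.g. battery levels, channel gains, task queues)
    to continuous transmit-power levels in [0, 1] for each UE."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(STATE_DIM, 64), nn.ReLU(),
            nn.Linear(64, ACTION_DIM), nn.Sigmoid())
    def forward(self, s):
        return self.net(s)

class Critic(nn.Module):
    """Estimates Q(s, a): the long-term (negative) energy consumption."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(STATE_DIM + ACTION_DIM, 64), nn.ReLU(),
            nn.Linear(64, 1))
    def forward(self, s, a):
        return self.net(torch.cat([s, a], dim=-1))

def kkt_channel_assignment(power, channel_gain):
    """Placeholder for phase 2: the paper derives the channel assignment from
    the Lagrangian / KKT conditions; here a simple gain-to-power ranking is
    used purely as an illustration."""
    score = channel_gain / (power + 1e-6)
    return np.argsort(-score)  # hypothetical assignment rule

actor, critic = Actor(), Critic()
actor_opt = torch.optim.Adam(actor.parameters(), lr=1e-4)
critic_opt = torch.optim.Adam(critic.parameters(), lr=1e-3)
replay = []

state = np.random.rand(STATE_DIM).astype(np.float32)  # dummy initial state
for step in range(1000):
    # Phase 1: DDPG actor outputs a continuous power allocation (plus exploration noise).
    s = torch.from_numpy(state)
    power = actor(s).detach().numpy() + 0.1 * np.random.randn(ACTION_DIM)
    power = np.clip(power, 0.0, 1.0)

    # Phase 2: channel assignment from the KKT/Lagrangian step (placeholder here).
    gains = np.random.rand(ACTION_DIM).astype(np.float32)
    assignment = kkt_channel_assignment(power, gains)

    # Phase 3: environment transition; reward is negative energy consumption
    # so that maximizing reward minimizes system energy (assumed model).
    energy = float(np.sum(power * gains[assignment]))
    reward = -energy
    next_state = np.random.rand(STATE_DIM).astype(np.float32)
    replay.append((state, power.astype(np.float32), reward, next_state))
    state = next_state

    # One-step DDPG update on a random minibatch from the replay buffer.
    if len(replay) >= 64:
        batch = [replay[i] for i in np.random.choice(len(replay), 64)]
        sb, ab, rb, nb = map(np.stack, zip(*batch))
        sb, ab, nb = map(torch.from_numpy, (sb, ab, nb))
        rb = torch.from_numpy(rb.astype(np.float32)).unsqueeze(-1)
        with torch.no_grad():
            target_q = rb + 0.99 * critic(nb, actor(nb))
        critic_loss = nn.functional.mse_loss(critic(sb, ab), target_q)
        critic_opt.zero_grad(); critic_loss.backward(); critic_opt.step()
        actor_loss = -critic(sb, actor(sb)).mean()
        actor_opt.zero_grad(); actor_loss.backward(); actor_opt.step()
```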