The rapid advancement of the Internet of Vehicles (IoV) is driving demand for efficient connectivity and communication between vehicles and infrastructure, in which resource allocation plays a central role. The objective of a resource allocation algorithm is to distribute limited resources, such as power and spectrum, to users in the network while satisfying their diverse requirements. In this paper, we introduce an Intrinsic Curiosity Module (ICM) based Double Q-Learning (DQL) approach for resource allocation, denoted ICM-DQRA, to address resource allocation challenges in IoV networks. We integrate the ICM into the DQL algorithm to provide the agent with an intrinsic reward. This intrinsic reward, absent from most reinforcement learning algorithms, encourages the agent to explore the environment more thoroughly and to make decisions that yield better returns. Comprehensive simulations show that the proposed method outperforms baseline approaches such as the greedy method and plain DQL. Specifically, the ICM-DQRA algorithm achieves more efficient resource allocation and reduces energy consumption across the vehicular network by 20% to 27%.
Curiosity-Driven Energy-Aware Resource Allocation for Internet of Vehicles
2024-10-07
Conference paper
English
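As a concrete illustration of the mechanism described in the abstract, the sketch below pairs tabular Double Q-learning with a curiosity-style intrinsic reward derived from the prediction error of a simple learned forward model. This is a minimal, hypothetical example, not the paper's implementation: the toy environment, state/action sizes, hyperparameters, and the simplified forward model are all assumptions made for illustration.

```python
# Minimal sketch (assumed setup, not the authors' code): Double Q-learning
# with a curiosity-driven intrinsic reward added to the extrinsic reward.
import numpy as np

rng = np.random.default_rng(0)

N_STATES, N_ACTIONS = 16, 4          # assumed toy problem sizes
GAMMA, ALPHA, EPS = 0.95, 0.1, 0.1   # assumed hyperparameters
ETA = 0.5                            # weight of the intrinsic reward

Q1 = np.zeros((N_STATES, N_ACTIONS))
Q2 = np.zeros((N_STATES, N_ACTIONS))

# Learned forward model P(s' | s, a); its prediction error plays the role
# of the ICM intrinsic (curiosity) reward in this simplified sketch.
forward = np.full((N_STATES, N_ACTIONS, N_STATES), 1.0 / N_STATES)

def intrinsic_reward(s, a, s_next):
    """Curiosity signal: how surprised the forward model is by s_next."""
    err = 1.0 - forward[s, a, s_next]        # large for unexpected transitions
    forward[s, a] *= (1.0 - ALPHA)           # move the model toward the
    forward[s, a, s_next] += ALPHA           # observed transition
    return err

def step_env(s, a):
    """Placeholder environment: random transition, reward = -energy cost."""
    s_next = int(rng.integers(N_STATES))
    extrinsic = -rng.random()                # e.g. negative energy consumption
    return s_next, extrinsic

def select_action(s):
    """Epsilon-greedy over the sum of both Q-tables."""
    if rng.random() < EPS:
        return int(rng.integers(N_ACTIONS))
    return int(np.argmax(Q1[s] + Q2[s]))

def update(s, a, r, s_next):
    """Double Q-learning: one table selects the action, the other evaluates it."""
    if rng.random() < 0.5:
        a_star = int(np.argmax(Q1[s_next]))
        Q1[s, a] += ALPHA * (r + GAMMA * Q2[s_next, a_star] - Q1[s, a])
    else:
        a_star = int(np.argmax(Q2[s_next]))
        Q2[s, a] += ALPHA * (r + GAMMA * Q1[s_next, a_star] - Q2[s, a])

s = 0
for _ in range(5000):
    a = select_action(s)
    s_next, r_ext = step_env(s, a)
    r = r_ext + ETA * intrinsic_reward(s, a, s_next)   # extrinsic + intrinsic
    update(s, a, r, s_next)
    s = s_next
```

In a vehicular resource allocation setting, the state would encode channel and queue conditions, actions would map to power and spectrum assignments, and the extrinsic reward would reflect energy consumption and QoS; the curiosity term only shapes exploration and does not change the underlying objective.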