Deep reinforcement learning (DRL) can be used to solve decision-making problems in changing and complex environments and is widely envisioned to have great potential for enabling autonomous driving. However, the large, continuous state and action spaces of autonomous driving tasks generally lead to low exploration efficiency and significantly slow down model training. In this paper, we propose a novel autonomous driving framework based on DRL with an intervention module. The module uses historical information to predict future states, so that the outcomes of potential actions can be evaluated in advance. In this way, an intrinsic reward is generated that prevents the DRL agent from wasting exploration on poor states. By integrating our DRL model with a perception module that uses supervised multi-task learning to extract useful features from raw sensor data, the proposed framework notably improves the efficiency of DRL training. A case study on The Open Racing Car Simulator (TORCS), with inputs from both distance sensors and vision sensors, is carried out to demonstrate the effectiveness of our method.
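The abstract only outlines the intervention idea at a high level. The sketch below is a minimal, hypothetical illustration of how an intrinsic reward could be derived from a history-based state predictor; it is not the authors' implementation. The names (InterventionModule, toy_predictor, is_poor_state), the lane-offset dynamics, and the penalty value are all assumptions introduced purely for illustration.

import numpy as np

class InterventionModule:
    # Hypothetical sketch: a (here, toy) dynamics model predicts the next
    # state from a short window of past states, and a negative intrinsic
    # reward is returned when the predicted state looks poor.
    def __init__(self, predictor, is_poor_state, penalty=-1.0, history_len=4):
        self.predictor = predictor          # (history, action) -> predicted next state
        self.is_poor_state = is_poor_state  # state -> bool, e.g. an off-track check
        self.penalty = penalty
        self.history = []
        self.history_len = history_len

    def observe(self, state):
        # Keep a short window of past states as the predictor's input.
        self.history.append(state)
        self.history = self.history[-self.history_len:]

    def intrinsic_reward(self, action):
        # Predict the outcome of the candidate action and penalize it
        # when the predicted state is judged to be poor.
        if len(self.history) < self.history_len:
            return 0.0
        predicted_next = self.predictor(np.stack(self.history), action)
        return self.penalty if self.is_poor_state(predicted_next) else 0.0

# Toy usage: state = [lateral offset from lane center, speed].
def toy_predictor(history, action):
    # Assume the lateral offset drifts with the steering action each step.
    last = history[-1]
    return np.array([last[0] + 0.5 * action, last[1]])

module = InterventionModule(
    predictor=toy_predictor,
    is_poor_state=lambda s: abs(s[0]) > 1.0,  # "poor" = far from lane center
)
for _ in range(4):
    module.observe(np.array([0.8, 20.0]))
print(module.intrinsic_reward(action=0.6))   # -1.0: predicted to leave the lane
print(module.intrinsic_reward(action=-0.6))  # 0.0: predicted to stay safe

In an actual DRL training loop, a penalty of this kind would simply be added to the environment reward at each step, steering exploration away from states the predictor flags as poor.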
Deep Reinforcement Learning with Intervention Module for Autonomous Driving
2022-09-01
1222569 bytes
Conference paper
Electronic Resource
English
Autonomous Driving with Deep Reinforcement Learning
SLUB | 2023
Evaluation of Deep Reinforcement Learning Algorithms for Autonomous Driving
British Library Conference Proceedings | 2020