Learning-based algorithms play a pivotal role in various functional modules of an autonomous driving system. Recognizing and accounting for the impact of learning-based algorithm uncertainties on other functional modules can be crucial for making more dependable driving behavior decisions and for selecting more appropriate driving precaution measures, as opposed to directly executing safety fallback strategies such as emergency braking. Motivated by optimizing safety without unnecessary disruption to the driving experience, this paper proposes an uncertainty-aware, dual-tiered decision-making method named DBNID, which is based on a dynamic Bayesian network (DBN) and an influence diagram (ID). First, the paper formulates the effects of uncertainty propagation stemming from the perception and prediction modules with a DBN model, and these effects are inferred using an expectation-maximization (EM) algorithm. Next, how the uncertainty propagation effects are incorporated into the decision-making process is presented in an ID model through the formulation of a utility function. Finally, the proposed DBNID method is evaluated on a simulation platform tailored for real-world autonomous driving testing. The results demonstrate that, by considering uncertainty propagation, the proposed method can significantly reduce the likelihood of violating critical safe-stop requirements while simultaneously improving the minimum time-to-collision (TTC) performance. The DBNID method offers valuable insights into integrating learning-based algorithm uncertainties into the decision-making process of autonomous vehicles.
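The abstract describes a pipeline in which a DBN propagates perception/prediction uncertainty into a belief over the driving situation, and an ID converts that belief into a driving decision via a utility function. The following minimal Python sketch illustrates that general structure only; it is not the paper's implementation, and the state definition, transition matrix, observation likelihood, utility table, and helper names (dbn_update, choose_action) are all illustrative assumptions.

```python
import numpy as np

# Minimal sketch (assumed, not from the paper): a two-state DBN belief update
# followed by an influence-diagram style expected-utility action choice.
# Latent state: whether a tracked pedestrian will enter the ego lane (0 = no, 1 = yes).

# Hypothetical transition model P(x_t | x_{t-1}) for the DBN's temporal link.
TRANSITION = np.array([[0.9, 0.1],
                       [0.2, 0.8]])

def dbn_update(belief, obs_likelihood):
    """One predict-update step of a discrete DBN (Bayes filter).

    belief:          P(x_{t-1} | o_{1:t-1}), shape (2,)
    obs_likelihood:  P(o_t | x_t) from the (uncertain) perception module, shape (2,)
    """
    predicted = TRANSITION.T @ belief          # propagate belief through the dynamics
    posterior = obs_likelihood * predicted     # fuse the noisy observation
    return posterior / posterior.sum()         # normalize to a proper distribution

# Hypothetical utility table U(action, state): rows = actions, columns = states.
ACTIONS = ["keep_speed", "decelerate", "emergency_brake"]
UTILITY = np.array([[ 1.0, -10.0],   # keep_speed: fine if clear, very bad if crossing
                    [ 0.5,   0.8],   # decelerate: small comfort cost, mostly safe
                    [-0.5,   0.2]])  # emergency_brake: disruptive fallback action

def choose_action(belief):
    """Influence-diagram decision: maximize expected utility under the belief."""
    expected_utility = UTILITY @ belief
    return ACTIONS[int(np.argmax(expected_utility))], expected_utility

if __name__ == "__main__":
    belief = np.array([0.7, 0.3])                       # prior over the latent state
    belief = dbn_update(belief, np.array([0.4, 0.6]))   # ambiguous perception output
    action, eu = choose_action(belief)
    print(belief, action, eu)
```

Under these assumptions, a more uncertain (more pessimistic) belief shifts the expected-utility maximum from keeping speed toward a precautionary deceleration rather than an immediate emergency brake, which mirrors the trade-off the abstract emphasizes.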
An Uncertainty-Aware, Dual-Tiered Decision-Making Method for Safe Autonomous Driving
IEEE Transactions on Intelligent Transportation Systems, vol. 26, no. 1, pp. 691-702
2025-01-01
Article (Journal)
Electronic Resource
English
HIGHLY HUMANOID SAFE-DRIVING DECISION-MAKING METHOD FOR AUTONOMOUS COMMERCIAL VEHICLE
European Patent Office | 2023
Delay-aware Robust Control for Safe Autonomous Driving
IEEE | 2022