Energy management strategies (EMSs) are pivotal in optimizing energy efficiency for vehicles equipped with hybrid electric powertrains. Despite the growing adoption of deep reinforcement learning (DRL)-based approaches, challenges persist in achieving satisfactory optimization performance and maintaining reliable control. Motivated by these challenges, this article introduces a novel trainable equivalent consumption minimization strategy (ECMS) framework with uncertainty-aware control for fuel cell hybrid electric tracked vehicles (FCHETVs). First, the proposed framework employs a DRL algorithm to dynamically determine and optimize the equivalent factor in the ECMS method, improving fuel economy. Then, the soft actor-critic (SAC) algorithm is formulated for efficient policy learning. To further enhance control reliability, an ensembled policy network method is incorporated to measure uncertainty and mitigate suboptimal actions, thereby improving decision-making robustness. Simulation results reveal that the SAC-based trainable ECMS achieves significant fuel economy improvements, outperforming the traditional SAC and adaptive ECMS (A-ECMS) methods by 2.93% and 5.15%, respectively. Moreover, the ensemble model ensures reliable and effective control, with online testing results indicating an additional 2.08% improvement in fuel economy. These findings underscore the effectiveness of integrating learning-based and optimization-based approaches in EMS design, offering a robust pathway to reducing energy consumption and promoting sustainable transportation solutions.
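
For orientation only, the sketch below illustrates the general idea described in the abstract, not the authors' implementation: a learned policy sets the ECMS equivalent factor online, and an ensemble of policy networks is used to gauge uncertainty, with the members' disagreement triggering a conservative fallback factor. All names and values (PolicyNet, EF_MIN, EF_MAX, Q_LHV, UNCERTAINTY_THRESHOLD, the network sizes) are hypothetical placeholders.

# Minimal, illustrative sketch (assumed names and values, not the paper's code)
# of uncertainty-aware selection of the ECMS equivalent factor.
import torch
import torch.nn as nn

Q_LHV = 120e6                  # assumed lower heating value of hydrogen [J/kg]
EF_MIN, EF_MAX = 1.5, 3.5      # hypothetical bounds on the equivalent factor
UNCERTAINTY_THRESHOLD = 0.15   # hypothetical disagreement threshold for fallback

class PolicyNet(nn.Module):
    """One ensemble member: maps the state to a normalized factor in [0, 1]."""
    def __init__(self, state_dim: int, hidden: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(state_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 1), nn.Sigmoid(),
        )

    def forward(self, state: torch.Tensor) -> torch.Tensor:
        return self.net(state)

def select_equivalent_factor(ensemble, state, fallback_ef=2.5):
    """Average the ensemble outputs; if the members disagree too much (a simple
    proxy for epistemic uncertainty), fall back to a conservative fixed factor."""
    with torch.no_grad():
        outs = torch.stack([net(state) for net in ensemble])   # shape [M, 1]
    mean, std = outs.mean().item(), outs.std().item()
    if std > UNCERTAINTY_THRESHOLD:
        return fallback_ef                       # conservative default action
    return EF_MIN + mean * (EF_MAX - EF_MIN)     # rescale to the physical range

def equivalent_consumption(m_dot_fuel, p_batt, ef):
    """Classic ECMS instantaneous cost: fuel rate plus battery power converted
    to an equivalent hydrogen rate via the equivalent factor."""
    return m_dot_fuel + ef * p_batt / Q_LHV

# Hypothetical usage: five ensemble members and an 8-dimensional state vector.
ensemble = [PolicyNet(state_dim=8) for _ in range(5)]
state = torch.zeros(8)
ef = select_equivalent_factor(ensemble, state)
cost = equivalent_consumption(m_dot_fuel=0.6e-3, p_batt=12e3, ef=ef)

In the article's framework the factor is produced by a SAC policy trained against a fuel-economy reward; the training loop is omitted here and only the online decision step is sketched.
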


    Title:
    Uncertainty-Aware Deep Reinforcement Learning for Trainable Equivalent Consumption Minimization Strategy of Fuel Cell Hybrid Electric Tracked Vehicle

    Contributors:
    Su, Qicong (author) / Huang, Ruchen (author) / Zhang, Zhendong (author) / Shou, Yiwen (author) / He, Hongwen (author)

    Published in:

    Publication date:
    2025-08-01

    Size:
    2386846 bytes

    Type of media:
    Article (Journal)

    Type of material:
    Electronic Resource

    Language:
    English