Please use this identifier to cite or link to this item:
https://hdl.handle.net/10356/178365
Title: Deep reinforcement learning-based energy-efficient decision-making for autonomous electric vehicle in dynamic traffic environments
Authors: Wu, Jingda; Song, Ziyou; Lv, Chen
Keywords: Engineering
Issue Date: 2023
Source: Wu, J., Song, Z. & Lv, C. (2023). Deep reinforcement learning-based energy-efficient decision-making for autonomous electric vehicle in dynamic traffic environments. IEEE Transactions on Transportation Electrification, 10(1), 875-887. https://dx.doi.org/10.1109/TTE.2023.3290069
Project: A2084c0156 NTU-SUG
Journal: IEEE Transactions on Transportation Electrification
Abstract: Autonomous driving techniques are promising for improving the energy efficiency of electrified vehicles (EVs) by adjusting driving decisions and optimizing energy requirements. Conventional energy-efficient autonomous driving methods resort to longitudinal velocity planning and fixed-route scenes, which are not sufficient to achieve optimality. In this article, a novel decision-making strategy is proposed for autonomous EVs (AEVs) to maximize energy efficiency by simultaneously considering lane-change and car-following behaviors. Leveraging the deep reinforcement learning (RL) algorithm, the proposed strategy processes complex state information of visual spatial–temporal topology and physical variables to better comprehend surrounding environments. A rule-based safety checker system is developed and integrated downstream of the RL decision-making module to improve lane-change safety. The proposed strategy is trained and evaluated in dynamic driving scenarios with interactive surrounding traffic participants. Simulation results demonstrate that the proposed strategy remarkably improves the EV’s energy economy over state-of-the-art techniques without compromising driving safety or traffic efficiency. Moreover, the results suggest that integrating visual state variables into the RL decision-making strategy is more effective at saving energy in complicated traffic situations.
URI: https://hdl.handle.net/10356/178365
ISSN: 2332-7782
DOI: 10.1109/TTE.2023.3290069
Schools: School of Mechanical and Aerospace Engineering
Rights: © 2023 IEEE. All rights reserved. This article may be downloaded for personal use only. Any other use requires prior permission of the copyright holder. The Version of Record is available online at http://doi.org/10.1109/TTE.2023.3290069.
Fulltext Permission: open
Fulltext Availability: With Fulltext
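
The abstract describes a two-stage pipeline: an RL decision-making module proposes lane-change or car-following actions, and a rule-based safety checker downstream can veto unsafe lane changes. The Python sketch below is a hypothetical illustration of that structure only, not the authors' implementation; the function names, state fields (`lane`, `s`, `v`), and thresholds (`min_gap`, `min_ttc`) are assumptions for illustration.

```python
import numpy as np

# Illustrative only: an RL policy proposes a high-level driving action,
# and a rule-based safety checker downstream may veto unsafe lane changes.

ACTIONS = ["keep_lane", "change_left", "change_right"]

def rl_policy(state_image, state_physical, q_network):
    """Greedy action selection from a learned value/policy network.
    `q_network` is any callable returning one score per action."""
    q_values = q_network(state_image, state_physical)
    return ACTIONS[int(np.argmax(q_values))]

def safety_checker(action, ego, neighbors, min_gap=10.0, min_ttc=3.0):
    """Rule-based check (simplified): reject a lane change if the gap or
    time-to-collision to any vehicle in the target lane is below a threshold."""
    if action == "keep_lane":
        return action
    target_lane = ego["lane"] + (1 if action == "change_left" else -1)
    for veh in neighbors:
        if veh["lane"] != target_lane:
            continue
        gap = abs(veh["s"] - ego["s"])            # longitudinal spacing
        closing_speed = max(ego["v"] - veh["v"], 1e-3)
        ttc = gap / closing_speed                 # crude time-to-collision
        if gap < min_gap or ttc < min_ttc:
            return "keep_lane"                    # veto: fall back to car following
    return action
```

In this sketch the checker only filters the discrete lane-change decision; thresholds and lane-indexing conventions are placeholders chosen for readability.
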
Appears in Collections: | MAE Journal Articles |
Files in This Item:
File | Description | Size | Format
---|---|---|---
Energy_efficient_RL_decision_making.pdf | | 2.35 MB | Adobe PDF
Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.