Data-driven energy management for electric vehicles using offline reinforcement learning
Abstract Energy management technologies have significant potential to optimize electric vehicle performance and support global energy sustainability. However, despite extensive research, their real-world application remains limited due to reliance on simulations, which often fail to bridge the gap between theory and practice. This study introduces a real-world data-driven energy management framework based on offline reinforcement learning. By leveraging electric vehicle operation data, the proposed approach eliminates the need for manually designed rules or reliance on high-fidelity simulations. It integrates seamlessly into existing frameworks, enhancing performance after deployment. The method is tested on fuel cell electric vehicles, optimizing energy consumption and reducing system degradation. Real-world data from an electric vehicle monitoring system in China validate its effectiveness. The results demonstrate that the proposed method consistently achieves superior performance under diverse conditions. Notably, with increasing data availability, performance improves significantly, from 88% to 98.6% of the theoretical optimum after two updates. Training on over 60 million kilometers of data enables the learning agent to generalize across previously unseen and corner-case scenarios. These findings highlight the potential of data-driven methods to enhance energy efficiency and vehicle longevity through large-scale vehicle data utilization.
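The abstract describes learning an energy-management policy purely from logged vehicle data, with no simulator or environment interaction during training. As a rough illustration of that offline setting only (this is a minimal sketch, not the authors' method), the snippet below runs fitted Q-iteration over a synthetic log of (state, action, reward, next state) tuples for a fuel cell EV with a discrete fuel-cell power split; all state features, the reward terms, and every hyperparameter are hypothetical placeholders.

```python
# Minimal offline-RL sketch: fitted Q-iteration on a fixed logged dataset.
# All quantities below are illustrative assumptions, not from the paper.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical logged dataset of (state, action, reward, next_state).
# State: [battery_soc, power_demand_kw]; action: fuel-cell power level index.
N, N_ACTIONS, GAMMA = 5000, 5, 0.99
states = rng.uniform([0.2, 0.0], [0.9, 60.0], size=(N, 2))
actions = rng.integers(0, N_ACTIONS, size=N)
fc_power = actions * 15.0                        # assumed kW per power level
# Toy reward: penalize hydrogen use and SOC deviation from a 0.6 setpoint.
rewards = -0.01 * fc_power - np.abs(states[:, 0] - 0.6)
next_states = states + rng.normal(0, [0.01, 2.0], size=(N, 2))

def features(s, a):
    """Linear Q-function features: per-action copy of [soc, demand, bias]."""
    phi = np.zeros((len(s), N_ACTIONS * 3))
    for i in range(N_ACTIONS):
        m = a == i
        phi[m, i * 3:(i + 1) * 3] = np.c_[s[m], np.ones(m.sum())]
    return phi

# Fitted Q-iteration: repeated ridge regression onto Bellman targets.
# Purely offline -- the loop never queries an environment or simulator.
w = np.zeros(N_ACTIONS * 3)
for _ in range(50):
    q_next = np.stack([features(next_states, np.full(N, a)) @ w
                       for a in range(N_ACTIONS)], axis=1)
    targets = rewards + GAMMA * q_next.max(axis=1)
    phi = features(states, actions)
    w = np.linalg.solve(phi.T @ phi + 1e-3 * np.eye(phi.shape[1]),
                        phi.T @ targets)

def act(s):
    """Deployed policy: fuel-cell power level with the highest Q-value."""
    q = [features(s[None, :], np.array([a])) @ w for a in range(N_ACTIONS)]
    return int(np.argmax(q))

print("action at SOC=0.5, demand=30 kW:", act(np.array([0.5, 30.0])))
```

The sketch only captures the structural point made in the abstract, that the policy is fit entirely from previously collected operation data; the paper itself presumably uses a far richer offline RL algorithm, real monitoring-system data, and a degradation-aware reward rather than this two-term toy objective.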
| Main Authors: | Yong Wang, Jingda Wu, Hongwen He, Zhongbao Wei, Fengchun Sun |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | Nature Portfolio, 2025-03-01 |
| Series: | Nature Communications |
| Online Access: | https://doi.org/10.1038/s41467-025-58192-9 |
| _version_ | 1849390424629706752 |
|---|---|
| author | Yong Wang; Jingda Wu; Hongwen He; Zhongbao Wei; Fengchun Sun |
| author_facet | Yong Wang; Jingda Wu; Hongwen He; Zhongbao Wei; Fengchun Sun |
| author_sort | Yong Wang |
| collection | DOAJ |
| description | Abstract Energy management technologies have significant potential to optimize electric vehicle performance and support global energy sustainability. However, despite extensive research, their real-world application remains limited due to reliance on simulations, which often fail to bridge the gap between theory and practice. This study introduces a real-world data-driven energy management framework based on offline reinforcement learning. By leveraging electric vehicle operation data, the proposed approach eliminates the need for manually designed rules or reliance on high-fidelity simulations. It integrates seamlessly into existing frameworks, enhancing performance after deployment. The method is tested on fuel cell electric vehicles, optimizing energy consumption and reducing system degradation. Real-world data from an electric vehicle monitoring system in China validate its effectiveness. The results demonstrate that the proposed method consistently achieves superior performance under diverse conditions. Notably, with increasing data availability, performance improves significantly, from 88% to 98.6% of the theoretical optimum after two updates. Training on over 60 million kilometers of data enables the learning agent to generalize across previously unseen and corner-case scenarios. These findings highlight the potential of data-driven methods to enhance energy efficiency and vehicle longevity through large-scale vehicle data utilization. |
| format | Article |
| id | doaj-art-602a5cac6c9a4236830b9468c0ff77dc |
| institution | Kabale University |
| issn | 2041-1723 |
| language | English |
| publishDate | 2025-03-01 |
| publisher | Nature Portfolio |
| record_format | Article |
| series | Nature Communications |
| spelling | doaj-art-602a5cac6c9a4236830b9468c0ff77dc; 2025-08-20T03:41:40Z; eng; Nature Portfolio; Nature Communications; 2041-1723; 2025-03-01; vol. 16, iss. 1, pp. 1-16; 10.1038/s41467-025-58192-9; Data-driven energy management for electric vehicles using offline reinforcement learning; Yong Wang, Jingda Wu, Hongwen He, Zhongbao Wei, Fengchun Sun (School of Mechanical Engineering, Beijing Institute of Technology); https://doi.org/10.1038/s41467-025-58192-9 |
| spellingShingle | Yong Wang; Jingda Wu; Hongwen He; Zhongbao Wei; Fengchun Sun; Data-driven energy management for electric vehicles using offline reinforcement learning; Nature Communications |
| title | Data-driven energy management for electric vehicles using offline reinforcement learning |
| title_full | Data-driven energy management for electric vehicles using offline reinforcement learning |
| title_fullStr | Data-driven energy management for electric vehicles using offline reinforcement learning |
| title_full_unstemmed | Data-driven energy management for electric vehicles using offline reinforcement learning |
| title_short | Data-driven energy management for electric vehicles using offline reinforcement learning |
| title_sort | data driven energy management for electric vehicles using offline reinforcement learning |
| url | https://doi.org/10.1038/s41467-025-58192-9 |
| work_keys_str_mv | AT yongwang datadrivenenergymanagementforelectricvehiclesusingofflinereinforcementlearning AT jingdawu datadrivenenergymanagementforelectricvehiclesusingofflinereinforcementlearning AT hongwenhe datadrivenenergymanagementforelectricvehiclesusingofflinereinforcementlearning AT zhongbaowei datadrivenenergymanagementforelectricvehiclesusingofflinereinforcementlearning AT fengchunsun datadrivenenergymanagementforelectricvehiclesusingofflinereinforcementlearning |