LearningEMS: A New Framework for Electric Vehicle Energy Management

10/04/2025 Frontiers Journals

A new study published in Engineering introduces LearningEMS, a unified framework and open-source benchmark designed to revolutionize the development and assessment of energy management strategies (EMS) for electric vehicles (EVs).

The automotive industry has undergone a transformative shift in recent years, driven by the growing global emphasis on sustainability and environmental protection, and EVs have become a crucial part of the future of transportation. However, effectively managing energy in EVs, particularly those with complex powertrains such as battery EVs, hybrid EVs, plug-in hybrid EVs, and fuel cell EVs, remains a challenge. An efficient EMS is essential for optimizing the energy efficiency of these vehicles.

LearningEMS provides a general platform that supports various EV configurations. It enables detailed comparison of EMS algorithms, including imitation learning, deep reinforcement learning (RL), offline RL, model predictive control, and dynamic programming. The framework provides three distinct EV platforms, an EMS policy dataset covering more than 10 000 km of driving, ten state-of-the-art algorithms, more than 160 benchmark tasks, and three learning libraries.

The researchers rigorously evaluated these algorithms from multiple perspectives, including energy efficiency, consistency, adaptability, and practicability. In the benchmark tests, for example, they found that discrete-action-space algorithms such as DQN and D3QN perform well on simple EMS tasks but become less efficient as the control parameters grow more complex. Off-policy algorithms with continuous action spaces, such as DDPG, TD3, and SAC, show strong potential for optimizing energy efficiency and maintaining consistency across different driving conditions, whereas the on-policy algorithm PPO exhibits significant performance variation across vehicles and operating conditions.
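
To make the discrete-versus-continuous distinction concrete, the short sketch below illustrates the two action-space formulations using Gymnasium spaces. It is an illustration only, not LearningEMS code; the power bounds and the number of discrete bins are assumed values.

```python
# Illustrative only: contrasting action-space designs for an EMS task.
# The power bounds and bin count are assumed values, not taken from the paper.
import numpy as np
from gymnasium import spaces

P_MIN, P_MAX = 0.0, 60.0  # assumed engine power range in kW

# Discrete formulation (DQN/D3QN style): the continuous command is quantized,
# so power splits between the bins are unreachable, and the action set grows
# quickly if more control parameters are added.
N_BINS = 11
discrete_actions = spaces.Discrete(N_BINS)
power_levels = np.linspace(P_MIN, P_MAX, N_BINS)  # kW value for each action index

# Continuous formulation (DDPG/TD3/SAC style): the agent outputs the power
# command directly, which scales more gracefully to complex control problems.
continuous_actions = spaces.Box(low=P_MIN, high=P_MAX, shape=(1,), dtype=np.float32)

print(power_levels)                   # the only power levels a discrete agent can pick
print(continuous_actions.sample())    # any value in [0, 60] kW
```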

The study also delves into important aspects of RL in EV energy management, such as the design of the state, reward, and action spaces. The researchers discuss how these choices can significantly affect the overall performance of the EMS. Additionally, they introduce a policy extraction and reconstruction method for deploying learning-based EMS onto real-world vehicle controllers and conduct hardware-in-the-loop experiments to demonstrate its feasibility.
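
As an illustration of how such state, action, and reward choices fit together, the following is a minimal sketch of a toy Gymnasium environment for a parallel-hybrid EMS task, trained briefly with Stable-Baselines3's SAC (one of the libraries and algorithm families the framework covers). All plant models, constants, and reward weights here are illustrative assumptions, not the LearningEMS implementation.

```python
# A minimal sketch of an EMS environment in the state/reward/action style the
# paper discusses. All plant models, constants, and reward weights below are
# illustrative assumptions, not values from LearningEMS.
import numpy as np
import gymnasium as gym
from gymnasium import spaces


class ToyHybridEMSEnv(gym.Env):
    """State: [battery SOC, vehicle speed, power demand]; action: engine power (kW)."""

    def __init__(self, episode_len=500):
        self.episode_len = episode_len
        # Observation: SOC in [0, 1], speed in m/s, demanded power in kW.
        self.observation_space = spaces.Box(
            low=np.array([0.0, 0.0, -60.0], dtype=np.float32),
            high=np.array([1.0, 40.0, 60.0], dtype=np.float32),
        )
        # Action: engine power command in kW; the battery supplies the rest.
        self.action_space = spaces.Box(low=0.0, high=60.0, shape=(1,), dtype=np.float32)

    def reset(self, *, seed=None, options=None):
        super().reset(seed=seed)
        self.t = 0
        self.soc = 0.6
        return self._obs(), {}

    def _obs(self):
        speed = 15.0 + 10.0 * np.sin(self.t / 50.0)                 # toy drive cycle
        self.p_demand = 0.8 * speed + 5.0 * np.sin(self.t / 20.0)   # toy power demand
        return np.array([self.soc, speed, self.p_demand], dtype=np.float32)

    def step(self, action):
        p_engine = float(action[0])
        p_batt = self.p_demand - p_engine                # battery covers the gap
        self.soc = np.clip(self.soc - 1e-4 * p_batt, 0.0, 1.0)
        fuel = 0.05 * p_engine + 0.001 * p_engine ** 2   # toy fuel-rate model
        # Reward: penalize fuel use and deviation from a 0.6 SOC target.
        reward = -fuel - 10.0 * (self.soc - 0.6) ** 2
        self.t += 1
        truncated = self.t >= self.episode_len
        return self._obs(), reward, False, truncated, {}


if __name__ == "__main__":
    # Train briefly with SAC, one of the continuous-action algorithms benchmarked.
    from stable_baselines3 import SAC

    env = ToyHybridEMSEnv()
    model = SAC("MlpPolicy", env, verbose=0)
    model.learn(total_timesteps=5_000)
    model.save("toy_ems_sac")  # the saved policy could later be exported for HIL tests
```

In this toy setting the reward simply trades off a simplified fuel-rate term against deviation from a target state of charge; the study's point is that such state, reward, and action choices materially shape EMS performance.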

According to the researchers, LearningEMS has the potential to improve energy efficiency, reduce vehicle operating costs, and extend the lifespan of power systems. The open-source nature of LearningEMS encourages further research and innovation in the field, allowing engineers and researchers to develop more advanced EMS algorithms.

The paper, “LearningEMS: A Unified Framework and Open-source Benchmark for Learning-based Energy Management of Electric Vehicles,” was authored by Yong Wang, Hongwen He, Yuankai Wu, Pei Wang, Haoyu Wang, Renzong Lian, Jingda Wu, Qin Li, Xiangfei Meng, Yingjuan Tang, Fengchun Sun, and Amir Khajepour. The full text of the open-access paper is available at https://doi.org/10.1016/j.eng.2024.10.021. For more information about Engineering, follow the journal on X (https://twitter.com/EngineeringJrnl) and Facebook (https://www.facebook.com/EngineeringJrnl).
Attachments
  • The learning-based EMS framework and system design of LearningEMS. (a) It consists of three layers: the EV environment layer, the learning-based algorithm layer, and the application layer. D3QN: dueling DDQN; CQL: conservative Q-learning; BCQ: batch-constrained Q-learning; SB3: Stable-Baselines3; RLlib: reinforcement learning library. (b) Training pipeline of LearningEMS: first, choose an EV environment (users can create new environments or add modules to existing ones); then, select an algorithm and dataset; finally, start training. After the policy is trained in simulation, it can be deployed directly onto the controller, enabling hardware-in-the-loop (HIL) or vehicle-in-the-loop (VIL) experiments.
