
Open Access

Reinforcement Learning for EV Fleet Smart Charging with On-Site Renewable Energy Sources

Lookup NU author(s): Dr Saleh Ali (ORCiD)


Licence

This work is licensed under a Creative Commons Attribution 4.0 International License (CC BY 4.0).


Abstract

In 2020, the transportation sector was the second largest source of carbon emissions in the UK and in Newcastle upon Tyne, responsible for about 33% of total emissions. To support the UK's target of reaching net zero emissions by 2050, electric vehicles (EVs) are pivotal in advancing carbon-neutral road transportation. Optimal EV charging requires a better understanding of the unpredictable output from on-site renewable energy sources (ORES). This paper proposes an integrated EV fleet charging scheduling method that applies proximal policy optimization (PPO) within a deep reinforcement learning framework. For the design of the reinforcement learning environment, mathematical models of wind and solar power generation are created. In addition, multivariate Gaussian distributions fitted to historical weather and EV fleet charging data are used to simulate weather and charging demand uncertainty, generating large datasets for training the model. The optimization problem is formulated as a Markov decision process (MDP) with operational constraints, and a PPO approach is devised to train artificial neural networks (ANNs) through successive transition simulations. The optimization approach is deployed and evaluated on a real-world scenario comprising council EV fleet charging data from Leicester, UK. The results show that, due to the design of the reward function and system limitations, the charging action is biased towards the time of day when renewable energy output is at its maximum (midday). The reinforcement learning charging decisions improve the utilization of renewable energy by 2–4% compared to random and priority charging policies. This study contributes to reduced battery charging and discharging, revenue from electricity sold to the grid, and a reduction in carbon emissions.
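As a minimal illustration of the simulation approach described in the abstract (not the paper's published code, which is linked below), the sketch here draws correlated weather samples from a two-variable multivariate Gaussian, standing in for the distributions the authors fit to historical weather data. All means, covariances, and variable choices here are hypothetical, and the Cholesky factorization is written out by hand for the 2x2 case.

```python
import math
import random

def cholesky_2x2(cov):
    # Cholesky factor L of a 2x2 positive-definite covariance matrix,
    # so that cov = L @ L.T
    a = math.sqrt(cov[0][0])
    b = cov[1][0] / a
    c = math.sqrt(cov[1][1] - b * b)
    return [[a, 0.0], [b, c]]

def sample_weather(mean, cov, rng):
    # Draw one correlated (solar irradiance, wind speed) sample:
    # x = mean + L @ z, where z is standard normal
    L = cholesky_2x2(cov)
    z = [rng.gauss(0.0, 1.0), rng.gauss(0.0, 1.0)]
    return [mean[i] + L[i][0] * z[0] + L[i][1] * z[1] for i in range(2)]

if __name__ == "__main__":
    rng = random.Random(42)
    mean = [500.0, 6.0]            # irradiance W/m^2, wind speed m/s (made up)
    cov = [[90000.0, 150.0],
           [150.0, 4.0]]           # covariance between the two (made up)
    samples = [sample_weather(mean, cov, rng) for _ in range(10000)]
    avg_irradiance = sum(s[0] for s in samples) / len(samples)
    print(round(avg_irradiance, 1))  # close to the 500.0 mean
```

In the paper's full pipeline, samples like these would feed the wind and solar generation models inside the MDP environment, from which the PPO agent learns a charging policy.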


Publication metadata

Author(s): Li H, Dai X, Goldrick S, Kotter R, Aslam N, Ali S

Publication type: Article

Publication status: Published

Journal: Energies

Year: 2024

Volume: 17

Issue: 21

Print publication date: 01/11/2024

Online publication date: 31/10/2024

Acceptance date: 08/10/2024

Date deposited: 31/10/2024

ISSN (electronic): 1996-1073

Publisher: MDPI AG

URL: https://doi.org/10.3390/en17215442

DOI: 10.3390/en17215442

Data Access Statement: The data that support the findings of this study are available from the corresponding author upon reasonable request, and the algorithm of this research can be found at https://github.com/handongli2019/Reinforcement-learning-for-Microgrid-management (accessed on 1 October 2024).




Funding

EPSRC, project Electric Fleets with On-site Renewable Energy Sources (EFORES), grant EP/W028727/1
EU Interreg North Sea Region programme, SEEV4-City (Smart, clean Energy and Electric Vehicles for the City) project, J-No. 38-2-23-15
Wuhan AI Innovation Program, grant 2022010702040056
