Grey wolf optimizer-based machine learning algorithm to predict electric vehicle charging duration time

Irfan Ullah, Kai Liu*, Toshiyuki Yamamoto, Md Shafiullah, Arshad Jamal

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

52 Scopus citations

Abstract

Precise charging time prediction can effectively mitigate the inconvenience to drivers caused by the charging behavior that is unavoidable during trips. Although the effectiveness of machine learning (ML) algorithms in predicting future outcomes has been established in a variety of applications, including the transportation sector, their application to electric vehicle (EV) charging time prediction remains largely unexplored, which motivates this investigation. This study developed an EV charging time prediction model based on two years of charging event data collected from 500 EVs in Japan. To predict EV charging time, three ML algorithms were employed: extreme learning machine (ELM), feed-forward neural network (FFNN), and support vector regression (SVR). Furthermore, the parameters of each ML algorithm were optimized with metaheuristic techniques, namely the gray wolf optimizer (GWO), particle swarm optimizer (PSO), and genetic algorithm (GA), to achieve higher accuracy and robustness. The prediction results reveal that the GWO-based ML models outperformed the other models.
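The abstract describes tuning the ML learners' parameters with GWO. The sketch below is a minimal illustration (not the authors' code) of how a grey wolf optimizer can tune SVR hyperparameters (C and gamma) against cross-validated error; the search bounds, synthetic data, and function names are assumptions made for the example.

```python
# Minimal GWO sketch for SVR hyperparameter tuning (illustrative only;
# bounds, data, and settings are assumptions, not the paper's setup).
import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import cross_val_score
from sklearn.datasets import make_regression

X, y = make_regression(n_samples=200, n_features=5, noise=0.1, random_state=0)

# Search space: log10(C) in [-2, 3], log10(gamma) in [-4, 1]
LB = np.array([-2.0, -4.0])
UB = np.array([3.0, 1.0])

def fitness(pos):
    """Negative cross-validated R^2 (lower is better) for the candidate C, gamma."""
    C, gamma = 10.0 ** pos
    return -cross_val_score(SVR(C=C, gamma=gamma), X, y, cv=3, scoring="r2").mean()

def gwo(n_wolves=10, n_iter=30, seed=0):
    rng = np.random.default_rng(seed)
    wolves = rng.uniform(LB, UB, size=(n_wolves, len(LB)))  # candidate positions
    scores = np.array([fitness(w) for w in wolves])
    for t in range(n_iter):
        order = np.argsort(scores)
        alpha, beta, delta = wolves[order[:3]]      # three best wolves lead the pack
        a = 2.0 - 2.0 * t / n_iter                  # coefficient decreasing from 2 to 0
        for i in range(n_wolves):
            new_pos = np.zeros(len(LB))
            for leader in (alpha, beta, delta):
                r1, r2 = rng.random(len(LB)), rng.random(len(LB))
                A, C_vec = 2 * a * r1 - a, 2 * r2
                D = np.abs(C_vec * leader - wolves[i])  # distance to this leader
                new_pos += (leader - A * D) / 3.0       # average pull of the three leaders
            wolves[i] = np.clip(new_pos, LB, UB)
            scores[i] = fitness(wolves[i])
    best = wolves[np.argmin(scores)]
    return 10.0 ** best, -scores.min()

(best_C, best_gamma), best_r2 = gwo()
print(f"best C={best_C:.3g}, gamma={best_gamma:.3g}, CV R^2={best_r2:.3f}")
```

The same wrapper pattern applies to the other learners (ELM, FFNN) and to PSO or GA, by swapping the fitness function or the position-update rule.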

Original language: English
Pages (from-to): 889-906
Number of pages: 18
Journal: Transportation Letters
Volume: 15
Issue number: 8
DOIs
State: Published - 2023

Bibliographical note

Publisher Copyright:
© 2022 Informa UK Limited, trading as Taylor & Francis Group.

Keywords

  • Electric vehicles (EVs)
  • Real-world data
  • charging time
  • gray wolf optimizer (GWO)
  • machine learning (ML) algorithm

ASJC Scopus subject areas

  • Transportation
