Submitted: 28 November 2024
Posted: 29 November 2024
Abstract
Keywords:
1. Introduction
- Novel Data Augmentation Framework: We introduce a GAN-based framework for synthesizing multivariate time-series data in EV driving scenarios, addressing a critical gap in synthetic data generation. This enables cost-effective, scalable, and diverse dataset creation.
- Advanced Model Architecture: TS-p2pGAN incorporates an integrated transformation network and multiscale discriminator to handle high-dimensional, extended time sequences. By redefining time-series regeneration as a data generation task, it produces high-quality synthetic data that enriches existing datasets.
- Comprehensive Evaluation: We implement a robust evaluation protocol using multiple quantitative and qualitative tools, providing in-depth insights into synthetic data quality and supporting iterative improvements in generative models.
- Real-World Validation: The model is rigorously validated using 38 real-world driving trips, demonstrating strong generalization capabilities and confirming its practical applicability in realistic driving scenarios.
2. Materials and Methods
2.1. Dataset
- Vehicle dynamics: speed, altitude, throttle position, motor torque, and longitudinal acceleration
- Battery metrics: power battery voltage, current, temperature, actual SOC, and displayed SOC
- Climate control: heater wattage demand, air conditioner power consumption, heater voltage, and current
- Environmental conditions: ambient temperature and related parameters
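Channels such as these span very different numeric ranges, so before generator training they are typically min-max scaled into [-1, 1] to match a Tanh output layer. A minimal sketch; the channel names and values below are illustrative, not taken from the dataset:

```python
def scale_to_tanh_range(series, lo=None, hi=None):
    """Min-max scale one channel into [-1, 1]; lo/hi default to the series extremes."""
    lo = min(series) if lo is None else lo
    hi = max(series) if hi is None else hi
    span = (hi - lo) or 1.0          # guard against constant channels
    return [2.0 * (x - lo) / span - 1.0 for x in series]

# Hypothetical slice of one driving trip (illustrative values only)
trip = {
    "speed_kmh": [0.0, 30.0, 60.0, 45.0],
    "battery_voltage_v": [360.0, 355.0, 350.0, 352.0],
}
scaled = {name: scale_to_tanh_range(values) for name, values in trip.items()}
```

Storing the per-channel `lo`/`hi` used here also allows synthetic outputs to be mapped back to physical units after generation.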
2.2.1. Architecture of the Transformation Net Model in the Generator Framework
2.2.2. Multiscale Discriminators
3. Experiments and Results
3.1. Data Preprocessing
3.2. Quantitative and Qualitative Performance Metrics
3.3.1. Real and Synthetic Data
4. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
| Name | Layer | (k, s, p) | f | d | Module |
|---|---|---|---|---|---|
| Input | - | - | 10 | 256 | Front-end (down-sampling operations) |
| Reflect_1 | Pad1d | (-, -, 3) | 10 | 262 | |
| Conv_1 | Conv1dN+R | (7, 1, 0) | 64 | 256 | |
| Conv_2 | Conv1dN+R | (3, 2, 1) | 128 | 128 | |
| Conv_3 | Conv1dN+R | (3, 2, 1) | 256 | 64 | |
| Conv_4 | Conv1dN+R | (3, 2, 1) | 512 | 32 | |
| Conv_5 | Conv1dN+R | (3, 2, 1) | 1024 | 16 | |
| ResnetBlock_1 | Pad1d / Conv1dN+R / Pad1d / Conv1dN+R | (-, -, 1) / (3, 1, 1) / (-, -, 1) / (3, 1, 1) | 1024 / 1024 / 1024 / 1024 | 16 / 18 / 16 / 16 | Residual blocks |
| ResnetBlock_2 | Pad1d / Conv1dN+R / Pad1d / Conv1dN+R | (-, -, 1) / (3, 1, 1) / (-, -, 1) / (3, 1, 1) | 1024 / 1024 / 1024 / 1024 | 16 / 18 / 16 / 16 | |
| ResnetBlock_3 | Pad1d / Conv1dN+R / Pad1d / Conv1dN+R | (-, -, 1) / (3, 1, 1) / (-, -, 1) / (3, 1, 1) | 1024 / 1024 / 1024 / 1024 | 16 / 18 / 16 / 16 | |
| ResnetBlock_4 | Pad1d / Conv1dN+R / Pad1d / Conv1dN+R | (-, -, 1) / (3, 1, 1) / (-, -, 1) / (3, 1, 1) | 1024 / 1024 / 1024 / 1024 | 16 / 18 / 16 / 16 | |
| ResnetBlock_5 | Pad1d / Conv1dN+R / Pad1d / Conv1dN+R | (-, -, 1) / (3, 1, 1) / (-, -, 1) / (3, 1, 1) | 1024 / 1024 / 1024 / 1024 | 16 / 18 / 16 / 16 | |
| ResnetBlock_6 | Pad1d / Conv1dN+R / Pad1d / Conv1dN+R | (-, -, 1) / (3, 1, 1) / (-, -, 1) / (3, 1, 1) | 1024 / 1024 / 1024 / 1024 | 16 / 18 / 16 / 16 | |
| ResnetBlock_7 | Pad1d / Conv1dN+R / Pad1d / Conv1dN+R | (-, -, 1) / (3, 1, 1) / (-, -, 1) / (3, 1, 1) | 1024 / 1024 / 1024 / 1024 | 16 / 18 / 16 / 16 | |
| ResnetBlock_8 | Pad1d / Conv1dN+R / Pad1d / Conv1dN+R | (-, -, 1) / (3, 1, 1) / (-, -, 1) / (3, 1, 1) | 1024 / 1024 / 1024 / 1024 | 16 / 18 / 16 / 16 | |
| ResnetBlock_9 | Pad1d / Conv1dN+R / Pad1d / Conv1dN+R | (-, -, 1) / (3, 1, 1) / (-, -, 1) / (3, 1, 1) | 1024 / 1024 / 1024 / 1024 | 16 / 18 / 16 / 16 | |
| ConvTran_1 | ConvTranspose1dN+R | (3, 2, 1) | 512 | 32 | Back-end (up-sampling operations) |
| ConvTran_2 | ConvTranspose1dN+R | (3, 2, 1) | 256 | 64 | |
| ConvTran_3 | ConvTranspose1dN+R | (3, 2, 1) | 128 | 128 | |
| ConvTran_4 | ConvTranspose1dN+R | (3, 2, 1) | 64 | 256 | |
| Reflect_2 | Pad1D | (-, -, 3) | 64 | 262 | |
| Conv_6 | Conv1D | (7, 1, 0) | 4 | 256 | |
| Tanh_1 | Tanh | - | 4 | 256 | |
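The d column of the table above follows standard 1-D convolution length arithmetic. A small sketch that reproduces the front-end and back-end lengths, assuming PyTorch conventions; `output_padding=1` on the transposed convolutions is an assumption the listed lengths imply but the table does not state:

```python
def conv1d_len(L, k, s, p):
    """Output length of a 1-D convolution: floor((L + 2p - k) / s) + 1."""
    return (L + 2 * p - k) // s + 1

def convtrans1d_len(L, k, s, p, out_pad=0):
    """Output length of a 1-D transposed convolution: (L - 1)*s - 2p + k + out_pad."""
    return (L - 1) * s - 2 * p + k + out_pad

L = 256 + 2 * 3                      # Reflect_1 pads 3 on each side -> 262
L = conv1d_len(L, 7, 1, 0)           # Conv_1 -> 256
for _ in range(4):                   # Conv_2 .. Conv_5, each (3, 2, 1)
    L = conv1d_len(L, 3, 2, 1)       # 128, 64, 32, 16
assert L == 16                       # bottleneck entering the residual blocks
assert conv1d_len(L, 3, 1, 1) == 16  # a (3, 1, 1) conv preserves this length
for _ in range(4):                   # ConvTran_1 .. ConvTran_4
    L = convtrans1d_len(L, 3, 2, 1, out_pad=1)
assert L == 256                      # restored to the input sequence length
```

The symmetry between the four strided convolutions and the four transposed convolutions is what lets the generator return a sequence of the same length as its input.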
| Name | Layer | (k, s, p) | f | d | Scale |
|---|---|---|---|---|---|
| Input | - | - | 14 | 256 | Scale 1 |
| Conv_1 | Conv1dR | (4, 2, 2) | 64 | 129 | |
| Conv_2 | Conv1dN+R | (4, 2, 2) | 128 | 65 | |
| Conv_3 | Conv1dN+R | (4, 2, 2) | 256 | 33 | |
| Conv_4 | Conv1dN+R | (4, 1, 2) | 512 | 34 | |
| Conv_5 | Conv1d | (4, 1, 2) | 1 | 35 | |
| Sigmoid_1 | Sigmoid | - | 1 | 35 | |
| Input | - | - | 14 | 256 | Scale 2 |
| Pool1D_1 | AvgPool1D | (3, 2, 1) | 14 | 128 | |
| Conv_6 | Conv1dR | (4, 2, 2) | 64 | 65 | |
| Conv_7 | Conv1dN+R | (4, 2, 2) | 128 | 33 | |
| Conv_8 | Conv1dN+R | (4, 2, 2) | 256 | 17 | |
| Conv_9 | Conv1dN+R | (4, 1, 2) | 512 | 18 | |
| Conv_10 | Conv1d | (4, 1, 2) | 1 | 19 | |
| Sigmoid_2 | Sigmoid | - | 1 | 19 | |
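The same length arithmetic traces both discriminator scales. A sketch, assuming PyTorch conventions: AvgPool1d shares the Conv1d length formula, and the elementwise Sigmoid leaves the final length of each scale unchanged:

```python
def out_len(L, k, s, p):
    """Output length shared by Conv1d and AvgPool1d: floor((L + 2p - k) / s) + 1."""
    return (L + 2 * p - k) // s + 1

def disc_scale_len(L):
    """Trace the sequence length through the five conv layers of one scale."""
    for k, s, p in [(4, 2, 2), (4, 2, 2), (4, 2, 2), (4, 1, 2), (4, 1, 2)]:
        L = out_len(L, k, s, p)
    return L  # the elementwise Sigmoid keeps this length

scale1 = disc_scale_len(256)                    # 129, 65, 33, 34, 35
scale2 = disc_scale_len(out_len(256, 3, 2, 1))  # AvgPool1D_1 halves 256 -> 128
assert (scale1, scale2) == (35, 19)
```

Because scale 2 sees a pooled copy of the input, its patch outputs cover a coarser temporal receptive field than scale 1, which is the point of the multiscale design.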
| Trip no. | RMSE (%) | MAE (%) | Trip no. | RMSE (%) | MAE (%) |
|---|---|---|---|---|---|
| 1 | 1.833 (3.666) | 1.005 (2.599) | 2 | 1.792 (3.942) | 0.980 (2.813) |
| 3 | 1.584 (5.353) | 0.899 (3.827) | 4 | 1.903 (31.478) | 0.783 (18.560) |
| 5 | 1.931 (7.450) | 0.863 (5.339) | 6 | 1.664 (7.935) | 0.905 (5.798) |
| 7 | 1.591 (4.788) | 0.845 (3.444) | 8 | 1.330 (5.812) | 0.699 (4.099) |
| 9 | 1.603 (25.776) | 0.821 (15.632) | 10 | 1.887 (6.890) | 0.979 (5.181) |
| 11 | 2.775 (7.943) | 1.085 (6.119) | 12 | 1.277 (11.272) | 0.696 (8.178) |
| 13 | 2.715 (6.361) | 1.411 (3.888) | 14 | 1.439 (6.962) | 0.808 (4.947) |
| 15 | 1.688 (7.298) | 0.933 (5.006) | 16 | 1.764 (6.756) | 0.987 (4.982) |
| 17 | 1.980 | 1.050 | 18 | 1.449 | 0.749 |
| 19 | 1.828 | 1.063 | 20 | 1.825 | 0.918 |
| 21 | 1.819 | 1.019 | 22 | 2.069 | 1.035 |
| 23 | 2.544 | 1.223 | 24 | 2.131 | 1.133 |
| 25 | 2.076 | 1.020 | 26 | 2.965 | 1.235 |
| 27 | 1.895 | 1.008 | 28 | 2.100 | 1.118 |
| 29 | 3.183 | 1.739 | 30 | 2.300 | 1.023 |
| 31 | 1.789 | 0.887 | 32 | 2.394 | 1.185 |
| 33 | 2.177 | 1.184 | 34 | 2.910 | 1.241 |
| 35 | 1.944 | 0.977 | 36 | 1.744 | 0.841 |
| 37 | 1.609 | 0.851 | 38 | 2.256 | 1.091 |
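For reference, RMSE and MAE over an SOC trace are computed as below. A minimal sketch with illustrative values, not taken from any trip in the table:

```python
import math

def rmse(y_true, y_pred):
    """Root-mean-square error between two equal-length series."""
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true))

def mae(y_true, y_pred):
    """Mean absolute error between two equal-length series."""
    return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)

# Illustrative SOC traces in percent (not from the paper)
real_soc = [80.0, 78.5, 77.0, 75.5]
synthetic_soc = [80.2, 78.1, 77.3, 75.4]
```

RMSE penalizes large point-wise deviations more heavily than MAE, which is why the two columns rank the trips slightly differently.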
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
