Version 1: Received: 17 December 2023 / Approved: 18 December 2023 / Online: 29 December 2023 (09:17:31 CET)
How to cite:
Cui, Y.; Li, Z.; Zhang, P. Informer Model with Season-Aware Block for Efficient Power Series Forecasting. Preprints 2023, 2023121377. https://doi.org/10.20944/preprints202312.1377.v1
APA Style
Cui, Y., Li, Z., & Zhang, P. (2023). Informer Model with Season-Aware Block for Efficient Power Series Forecasting. Preprints. https://doi.org/10.20944/preprints202312.1377.v1
Chicago/Turabian Style
Cui, Y., Zhao Li, and Peng Zhang. 2023. "Informer Model with Season-Aware Block for Efficient Power Series Forecasting." Preprints. https://doi.org/10.20944/preprints202312.1377.v1
Abstract
With the development of electricity spot markets, accurate electricity load forecasting enables power generation companies to supply the right amount of electricity and significantly reduce power waste. As a result, time series forecasting in the power domain can bring significant benefits. The Informer model successfully introduced the Transformer into long sequence time-series forecasting (LSTF) by proposing the ProbSparse self-attention mechanism, which addresses the inherently high memory complexity of self-attention. Recent research has further demonstrated the potential of self-attention for mining complex dependencies. However, the limited amount of historical data has become one of the main challenges in applying deep learning techniques to power LSTF tasks. Previous studies often add a large number of temporal covariates to provide more information. In this paper, to address this issue, (i) we design a simple but effective Season-aware Block that enhances the model's ability to mine the artificial prior information in temporal covariates; (ii) we conduct experiments on provincial power data of Zhejiang Province, China, from 2019 to 2022, where our model outperforms other models, achieving a 19% relative improvement in MSE; (iii) we conduct ablation experiments to assess the efficacy of the Season-aware Block in extracting temporal periodic features. Furthermore, we elucidate the underlying reasons for the effectiveness of both the self-attention mechanism and the Season-aware Block through visualization experiments.
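The ProbSparse mechanism referenced in the abstract selects only the most "active" queries for full attention, reducing memory cost. A minimal NumPy sketch of the core idea is below; it is an illustration, not the paper's implementation (the published Informer additionally subsamples keys when scoring queries to reach O(L log L) complexity, which is omitted here for clarity, and the function name and `factor` parameter are our own):

```python
import numpy as np

def probsparse_attention(Q, K, V, factor=5):
    """Illustrative sketch of ProbSparse self-attention: only the top-u
    'active' queries attend to all keys; 'lazy' queries fall back to the
    mean of the values."""
    L_Q, d = Q.shape
    # Number of active queries, on the order of log(L_Q)
    u = min(L_Q, int(factor * np.ceil(np.log(L_Q))))

    # Query sparsity measurement: max over keys minus mean over keys
    scores = Q @ K.T / np.sqrt(d)                 # (L_Q, L_K)
    M = scores.max(axis=1) - scores.mean(axis=1)  # (L_Q,)
    top = np.argsort(M)[-u:]                      # most active queries

    # Lazy queries: approximate their output by the mean of V
    out = np.tile(V.mean(axis=0), (L_Q, 1))

    # Active queries: ordinary softmax attention over all keys
    w = np.exp(scores[top] - scores[top].max(axis=1, keepdims=True))
    w /= w.sum(axis=1, keepdims=True)
    out[top] = w @ V
    return out
```

A uniform (lazy) query produces a near-uniform attention distribution, so its output is well approximated by the value mean; the sparsity score M singles out the queries for which full attention actually matters.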
Keywords
LSTF; self-attention; data mining; temporal covariates
Subject
Computer Science and Mathematics, Artificial Intelligence and Machine Learning
Copyright:
This is an open access article distributed under the Creative Commons Attribution License which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.