Preprint Article · Version 1 · Preserved in Portico · This version is not peer-reviewed

Informer Model with Season-Aware Block for Efficient Power Series Forecasting

Version 1: Received: 17 December 2023 / Approved: 18 December 2023 / Online: 29 December 2023 (09:17:31 CET)

How to cite: Cui, Y.; Li, Z.; Zhang, P. Informer Model with Season-Aware Block for Efficient Power Series Forecasting. Preprints 2023, 2023121377. https://doi.org/10.20944/preprints202312.1377.v1

Abstract

With the development of electricity spot markets, accurate electricity load forecasting enables power generation companies to supply the right amount of electricity and significantly reduce power waste. Accurate time series forecasting in the power sector can therefore bring substantial benefits. Previously, the Informer model successfully introduced the Transformer into long sequence time-series forecasting (LSTF) by proposing the ProbSparse self-attention mechanism, which alleviated the high memory complexity inherent in self-attention. Recent research has further demonstrated the potential of self-attention for mining complex dependencies. However, the limited amount of historical data has become one of the main challenges in applying deep learning techniques to power LSTF tasks. Previous studies often add a large number of temporal covariates to provide more information. In this paper, to address this issue, (i) we design a simple but effective Season-aware Block to enhance the model's ability to mine artificial prior information in temporal covariates; (ii) we conduct experiments using the provincial power data of Zhejiang Province, China, from 2019 to 2022, where our model outperforms other models, achieving a 19% relative improvement in MSE; (iii) we conduct ablation experiments to assess the efficacy of the Season-aware Block in extracting temporal periodic features. Furthermore, we elucidate the underlying reasons for the effectiveness of both the self-attention mechanism and the Season-aware Block through visualization experiments.
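The abstract does not spell out the internal structure of the Season-aware Block. As a minimal sketch of the general idea, assuming a PyTorch implementation in which temporal covariates arrive as integer calendar codes, such a block could learn one embedding per calendar field and add the result to the series' value embedding, exposing periodic prior information to the encoder. The class name `SeasonAwareBlock`, the choice of fields, and the tensor layout below are all hypothetical, not taken from the paper.

```python
import torch
import torch.nn as nn

class SeasonAwareBlock(nn.Module):
    """Hypothetical sketch: learned seasonal embeddings for temporal covariates.

    Each calendar field (month, weekday, hour) gets its own embedding table;
    the summed seasonal embedding is added to the value embedding of the
    input series so downstream self-attention can exploit periodicity.
    """

    def __init__(self, d_model: int):
        super().__init__()
        self.month_emb = nn.Embedding(12, d_model)   # month in 0..11
        self.weekday_emb = nn.Embedding(7, d_model)  # weekday in 0..6
        self.hour_emb = nn.Embedding(24, d_model)    # hour in 0..23

    def forward(self, x_emb: torch.Tensor, covariates: torch.Tensor) -> torch.Tensor:
        # x_emb:      (batch, seq_len, d_model) value embeddings of the series
        # covariates: (batch, seq_len, 3) integer codes [month, weekday, hour]
        season = (
            self.month_emb(covariates[..., 0])
            + self.weekday_emb(covariates[..., 1])
            + self.hour_emb(covariates[..., 2])
        )
        return x_emb + season
```

Under this reading, the block is a drop-in addition to the Informer embedding layer: it injects the hand-crafted calendar prior additively, so the rest of the encoder-decoder stack is left unchanged.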

Keywords

LSTF; self-attention; data mining; temporal covariates

Subject

Computer Science and Mathematics, Artificial Intelligence and Machine Learning


