Preprint Article · Version 1 · Preserved in Portico · This version is not peer-reviewed

A Two-Stage Attention Based Hierarchical Transformer for Remaining Useful Life Prediction

Version 1 : Received: 28 December 2023 / Approved: 29 December 2023 / Online: 29 December 2023 (03:18:35 CET)

A peer-reviewed article of this Preprint also exists.

Fan, Z.; Li, W.; Chang, K.-C. A Two-Stage Attention-Based Hierarchical Transformer for Turbofan Engine Remaining Useful Life Prediction. Sensors 2024, 24, 824.

Abstract

Accurate estimation of Remaining Useful Life (RUL) for aircraft engines is essential for ensuring safety and uninterrupted operations in the aviation industry. Numerous investigations have leveraged the success of the attention-based Transformer architecture in sequence modeling tasks, particularly in its application to RUL prediction. These studies primarily focus on utilizing onboard sensor readings as input predictors. While various Transformer-based approaches have demonstrated improvement in RUL predictions, their exclusive focus on temporal attention within multivariate time series sensor readings, without considering sensor-wise attention, raises concerns about potential inaccuracies in RUL predictions. To address this concern, our paper proposes a novel solution in the form of a two-stage attention-based hierarchical Transformer (STAR) framework. This approach incorporates a two-stage attention mechanism, systematically addressing both temporal and sensor-wise attention. Furthermore, we enhance the STAR RUL prediction framework by integrating hierarchical encoder-decoder structures to capture valuable information across different time scales. By conducting extensive numerical experiments with the CMAPSS datasets, we demonstrate that our proposed STAR framework significantly outperforms current state-of-the-art models for RUL prediction.
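The two-stage idea described above — attending first over time steps and then over sensor channels — can be sketched as follows. This is a minimal, hypothetical single-head NumPy illustration, not the paper's implementation: the actual STAR model uses learned multi-head attention inside a hierarchical encoder-decoder, and the shapes here (30 cycles × 14 sensors, roughly matching the commonly used CMAPSS channels) are assumptions for the example.

```python
import numpy as np

def scaled_dot_product_attention(q, k, v):
    """Standard scaled dot-product attention with a row-wise softmax."""
    d = q.shape[-1]
    scores = q @ k.T / np.sqrt(d)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v

def two_stage_attention(x):
    """Apply attention across time steps, then across sensors.

    x: array of shape (T, S) -- T time steps, S sensor channels.
    Hypothetical sketch: self-attention with no learned projections.
    """
    # Stage 1: temporal attention -- each time step attends to all others.
    temporal = scaled_dot_product_attention(x, x, x)         # (T, S)
    # Stage 2: sensor-wise attention -- transpose so each sensor channel
    # attends to every other channel, then transpose back.
    xt = temporal.T                                          # (S, T)
    return scaled_dot_product_attention(xt, xt, xt).T        # (T, S)

rng = np.random.default_rng(0)
readings = rng.normal(size=(30, 14))   # e.g., 30 cycles x 14 sensor channels
out = two_stage_attention(readings)
print(out.shape)
```

Transposing between the two stages is what lets a single attention primitive serve both roles: stage 1 mixes information along the time axis while leaving each sensor channel separate, and stage 2 does the reverse.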

Keywords

two-stage attention; multiscale transformer; remaining useful life prediction; turbofan aircraft engine

Subject

Computer Science and Mathematics, Applied Mathematics

