Version 1: Received: 1 September 2021 / Approved: 2 September 2021 / Online: 2 September 2021 (11:36:48 CEST)
Version 2: Received: 15 September 2021 / Approved: 16 September 2021 / Online: 16 September 2021 (11:47:31 CEST)
Version 3: Received: 11 January 2022 / Approved: 12 January 2022 / Online: 12 January 2022 (10:22:47 CET)
How to cite:
Nor, A.K.M.; Pedapati, S.R.; Muhammad, M.; Leiva, V. Explainable Artificial Intelligence for Anomaly Detection and Prognostic of Gas Turbines using Uncertainty Quantification with Sensor-Related Data. Preprints 2021, 2021090034. https://doi.org/10.20944/preprints202109.0034.v2
APA Style
Nor, A.K.M., Pedapati, S.R., Muhammad, M., & Leiva, V. (2021). Explainable Artificial Intelligence for Anomaly Detection and Prognostic of Gas Turbines using Uncertainty Quantification with Sensor-Related Data. Preprints. https://doi.org/10.20944/preprints202109.0034.v2
Chicago/Turabian Style
Nor, A.K.M., S.R. Pedapati, M. Muhammad, and V. Leiva. 2021. "Explainable Artificial Intelligence for Anomaly Detection and Prognostic of Gas Turbines using Uncertainty Quantification with Sensor-Related Data." Preprints. https://doi.org/10.20944/preprints202109.0034.v2
Abstract
Explainable artificial intelligence (XAI) is in its assimilation phase in prognostics and health management (PHM). The PHM-XAI literature is deficient with respect to uncertainty quantification and explanation-evaluation metrics. This paper proposes a new anomaly detection and prognostic method for gas turbines using Bayesian deep learning and Shapley additive explanations (SHAP). The method explains both the anomaly detection and the prognostic and improves prognostic performance, aspects that have not been considered in the PHM-XAI literature. The uncertainty measures considered broaden the scope of the explanation and can also be exploited as anomaly indicators. Real-world gas turbine sensor data were used for the anomaly detection, while NASA commercial modular aero-propulsion system simulation (CMAPSS) turbofan sensor data were used for the prognostic. The generated explanation is evaluated using two metrics: consistency and local accuracy. All anomalies were successfully detected using the uncertainty indicators. Meanwhile, the turbofan prognostic results showed up to a 9% improvement in root mean square error (RMSE) and a 43% enhancement in early prognostic due to SHAP, making the method comparable to the best existing approaches. Together, XAI and uncertainty quantification offer a comprehensive explanation for assisting decision-making. Additionally, SHAP's ability to increase PHM performance confirms its value in AI-based reliability research.
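The local-accuracy metric named in the abstract is a standard property of Shapley-based explanations: the feature attributions for a single prediction must sum to the difference between the model's output and its output on a baseline input. A minimal sketch of checking this property, using exact Shapley values over a hypothetical linear toy model rather than the paper's Bayesian deep network (the model, inputs, and baseline below are illustrative assumptions, not from the paper):

```python
from itertools import combinations
from math import factorial

def shapley_values(f, x, baseline):
    """Exact Shapley attributions for f at point x against a baseline.

    Features absent from a coalition are replaced by their baseline value.
    Exponential in the number of features, so only suitable for tiny toys."""
    n = len(x)

    def value(subset):
        z = [x[i] if i in subset else baseline[i] for i in range(n)]
        return f(z)

    phis = []
    for i in range(n):
        others = [j for j in range(n) if j != i]
        phi = 0.0
        for k in range(n):
            # Weighted marginal contribution of feature i over all
            # coalitions of size k drawn from the remaining features.
            for s in combinations(others, k):
                w = factorial(k) * factorial(n - k - 1) / factorial(n)
                phi += w * (value(set(s) | {i}) - value(set(s)))
        phis.append(phi)
    return phis

# Illustrative linear "model" standing in for a real predictor.
f = lambda z: 2.0 * z[0] - 1.0 * z[1] + 0.5 * z[2]
x = [1.0, 2.0, 3.0]
base = [0.0, 0.0, 0.0]

phi = shapley_values(f, x, base)

# Local accuracy: attributions sum to f(x) - f(baseline).
assert abs(sum(phi) - (f(x) - f(base))) < 1e-9
```

For a linear model the attribution of each feature reduces to its weight times its deviation from the baseline, which is why the check passes exactly here; for the deep models in the paper, SHAP approximations are evaluated against the same identity.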
Keywords
Artificial intelligence; CMAPSS; consistency and local accuracy; CUSUM chart; deep learning; prognostic and health management; RMSE; sensing and data extraction; SHAP; uncertainty; XAI
Subject
Computer Science and Mathematics, Artificial Intelligence and Machine Learning
Copyright:
This is an open access article distributed under the Creative Commons Attribution License which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.