Submitted: 16 January 2024
Posted: 17 January 2024
Abstract
Keywords:
I. INTRODUCTION
- A. TRANSFORMER COOLING SYSTEM METHODS
- 1) DRY-TYPE TRANSFORMERS
- 2) OIL-TYPE TRANSFORMERS
- B. MONITORING OF COOLING SYSTEMS
- 1) OFFLINE AND ONLINE MONITORING
- 2) OVERVIEW OF ONLINE COOLING SYSTEM MONITORING TECHNIQUES
- 3) AF TRANSFORMER COOLING FAN MONITORING
- C. CONTRIBUTIONS
- We introduce machine and deep learning methods to address the gap in the literature on malfunction detection in AF transformer cooling fans.
- Acoustic data are used to detect malfunctions in transformer cooling fans. By collecting audio signals with simple microphones, the study provides a novel approach to online monitoring of cooling fan conditions, introducing alternative data types and methods that support operational decision-making.
- We also develop a comprehensive classifier threshold optimization process to enhance the performance of the learning algorithms for fault detection in transformer cooling systems.
II. METHODOLOGY
- A. DATA COLLECTION AND PREPROCESSING
- 1) DATA AUGMENTATION
- Calculating the power of the original audio signal: $P_x = \frac{1}{N}\sum_{n=1}^{N} x[n]^2$, where $N$ is the length of the original audio signal $x[n]$ and $P_x$ is its power.
- Selecting a random signal-to-noise ratio (SNR) within a range from 7 to 20 dB [30].
- Calculating the desired noise power: $P_w = P_x / 10^{\mathrm{SNR}/10}$, where $P_w$ indicates the desired noise power.
- Generating white noise $w[n]$ by sampling from a normal distribution with mean 0 and standard deviation 1.
- Scaling the white noise to match the desired noise power: $\hat{w}[n] = \alpha\, w[n]$ with $\alpha = \sqrt{P_w / \sigma_w^2}$, where $\alpha$, $\sigma_w^2$, and $\hat{w}[n]$ indicate the scale factor, the variance of the white noise, and the scaled white noise, respectively.
- Adding the scaled white noise to the original audio signal: $y[n] = x[n] + \hat{w}[n]$, where $y[n]$ is the audio signal injected with white noise.
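The augmentation steps above can be sketched in Python (a minimal illustration; the function name, the default SNR handling, and the use of NumPy's default generator are assumptions, not the paper's implementation):

```python
import numpy as np

def add_white_noise(x, snr_db=None, rng=None):
    """Inject Gaussian white noise into an audio signal at a given
    (or randomly drawn) SNR, following the steps listed above."""
    rng = np.random.default_rng() if rng is None else rng
    if snr_db is None:
        snr_db = rng.uniform(7.0, 20.0)             # random SNR in [7, 20] dB
    p_signal = np.mean(x ** 2)                      # power of the original signal
    p_noise = p_signal / (10.0 ** (snr_db / 10.0))  # desired noise power
    w = rng.normal(0.0, 1.0, size=x.shape)          # white noise ~ N(0, 1)
    alpha = np.sqrt(p_noise / np.var(w))            # scale factor
    return x + alpha * w                            # injected signal
```

Scaling by the empirical variance of the drawn noise (rather than the nominal unit variance) makes the achieved SNR match the target closely even for short windows.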
- B. FEATURE EXTRACTION AND SPECTROGRAM REPRESENTATION
- 1) FEATURE EXTRACTION TECHNIQUES
- RMS quantifies the amplitude of a signal over a certain time period and represents the overall energy of the signal: $\mathrm{RMS} = \sqrt{\frac{1}{K}\sum_{k=1}^{K} x[k]^2}$, where $K$ is the number of samples in a window and $x[k]$ is the $k$-th sample.
- ZCR counts how many times the signal crosses the zero-amplitude line: $\mathrm{ZCR} = \frac{1}{K-1}\sum_{k=1}^{K-1}\mathbb{1}\{x[k]\,x[k+1] < 0\}$.
- The kurtosis factor quantifies the extent to which a distribution is heavy-tailed or light-tailed relative to a normal distribution: $\mathrm{Kurt} = \frac{\frac{1}{K}\sum_{k=1}^{K}(x[k]-\bar{x})^4}{\left(\frac{1}{K}\sum_{k=1}^{K}(x[k]-\bar{x})^2\right)^2} - 3$. A positive kurtosis suggests heavier tails, a negative kurtosis suggests lighter tails, and a kurtosis of 0 indicates normal tails.
- The shape factor describes the waveform by relating the RMS to the mean absolute amplitude, giving insight into the duration and relative proportions of the positive and negative peaks: $\mathrm{SF} = \frac{\mathrm{RMS}}{\frac{1}{K}\sum_{k=1}^{K}|x[k]|}$.
- The crest factor measures the ratio between the peak amplitude and the RMS: $\mathrm{CF} = \frac{\max_k |x[k]|}{\mathrm{RMS}}$.
- The impulse factor characterizes the impulsiveness or transient nature of a signal, providing information about sudden changes or impulses: $\mathrm{IF} = \frac{\max_k |x[k]|}{\frac{1}{K}\sum_{k=1}^{K}|x[k]|}$.
- In addition, statistical measures such as the mean, variance, minimum, and maximum of the signal were calculated for each window.
- The spectral centroid locates the center of mass of a spectrum, providing an estimate of the dominant frequency in the signal [36]: $\mathrm{SC} = \frac{\sum_i f_i\,|X_i|}{\sum_i |X_i|}$, where $f_i$ is the frequency of bin $i$ in the FFT of the window and $|X_i|$ is the corresponding magnitude.
- The spectral bandwidth is the standard deviation of the distribution of spectral components around the spectral centroid [37]: $\mathrm{SB} = \sqrt{\frac{\sum_i (f_i - \mathrm{SC})^2\,|X_i|}{\sum_i |X_i|}}$.
- Spectral flatness measures how uniformly the power of a signal is distributed across frequency. Mathematically, it is the ratio of the geometric mean to the arithmetic mean of the power spectrum [38]: $\mathrm{SFM} = \frac{\left(\prod_{i=1}^{M} P_i\right)^{1/M}}{\frac{1}{M}\sum_{i=1}^{M} P_i}$, where $P_i$ is the power in bin $i$ and $M$ is the number of bins.
- Spectral flux represents the rate of change in the spectral content of a signal, indicating how quickly its frequency energy distribution changes. It is the 2-norm of the difference between the magnitude spectra of consecutive frames [39]: $\mathrm{Flux}_t = \left\|\,|X_t| - |X_{t-1}|\,\right\|_2$.
- The peak frequency is the frequency associated with the maximum power in the signal [40].
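Several of the window-level features listed above can be computed with NumPy as follows (a sketch using standard textbook definitions; the function and variable names are illustrative, not from the paper):

```python
import numpy as np

def frame_features(x, fs):
    """Compute time- and frequency-domain features for one analysis
    window x sampled at fs Hz."""
    K = len(x)
    rms = np.sqrt(np.mean(x ** 2))
    zcr = np.mean(np.abs(np.diff(np.sign(x))) > 0)   # fraction of sign changes
    peak = np.max(np.abs(x))
    crest = peak / rms                               # crest factor
    shape = rms / np.mean(np.abs(x))                 # shape factor
    impulse = peak / np.mean(np.abs(x))              # impulse factor
    # excess kurtosis: 0 for a normal distribution
    kurt = np.mean((x - x.mean()) ** 4) / np.var(x) ** 2 - 3.0
    # spectral features from the magnitude spectrum of the window
    mag = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(K, d=1.0 / fs)
    centroid = np.sum(freqs * mag) / np.sum(mag)
    bandwidth = np.sqrt(np.sum((freqs - centroid) ** 2 * mag) / np.sum(mag))
    power = mag ** 2
    flatness = np.exp(np.mean(np.log(power + 1e-12))) / (np.mean(power) + 1e-12)
    peak_freq = freqs[np.argmax(power)]
    return dict(rms=rms, zcr=zcr, crest=crest, shape=shape, impulse=impulse,
                kurtosis=kurt, centroid=centroid, bandwidth=bandwidth,
                flatness=flatness, peak_freq=peak_freq)
```

The small epsilon in the flatness computation guards against log-of-zero for empty frequency bins.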
- 2) SPECTROGRAM REPRESENTATION TECHNIQUE
- Using a windowing technique, audio signals were divided into short overlapping segments.
- FFT was applied to each segment to transform the audio signal from the time domain to the frequency domain.
- The power spectrum was calculated by taking the squared magnitude of the Fourier transform.
- By plotting the power spectrum values as a heat map or a grayscale image, the spectrograms are created. The X-axis represents time, the Y-axis indicates frequency, and the intensity or color of each pixel indicates the magnitude or power of the corresponding frequency component.
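The spectrogram construction steps above can be sketched as follows (the window length, hop size, and Hann window are assumed defaults, not the paper's settings):

```python
import numpy as np

def power_spectrogram(x, win_len=256, hop=128):
    """Build a power spectrogram: divide the signal into short
    overlapping windowed segments, apply the FFT to each, and take
    the squared magnitude."""
    window = np.hanning(win_len)
    n_frames = 1 + (len(x) - win_len) // hop
    spec = np.empty((win_len // 2 + 1, n_frames))
    for t in range(n_frames):
        seg = x[t * hop: t * hop + win_len] * window  # overlapping segment
        spec[:, t] = np.abs(np.fft.rfft(seg)) ** 2    # power spectrum
    return spec  # rows: frequency bins, columns: time frames
```

Plotting e.g. `10 * np.log10(spec + 1e-12)` as an image then gives the heat-map form described above, with time on the x-axis and frequency on the y-axis.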
- C. DATA NORMALIZATION TECHNIQUES
- D. LEARNING ALGORITHMS
- E. EVALUATION METRICS
- Accuracy: measures the overall correctness of the model's predictions, i.e., the ratio of correctly classified samples to the total number of samples.
- Recall: measures the ability of the models to correctly identify all instances of cooling fan malfunctions, providing insight into their capacity to minimize cases where malfunctioning cooling fans go undetected.
- F1 score: the harmonic mean of precision and recall, balancing the trade-off between correctly classifying predicted malfunctions (precision) and capturing all malfunctioning instances (recall).
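These metrics can be computed directly from label counts (a plain-Python sketch; the malfunction class is assumed to be the positive class, and the function name is illustrative):

```python
def classification_metrics(y_true, y_pred, positive=1):
    """Accuracy, recall, and F1 score from true and predicted labels."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p == positive)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t != positive and p == positive)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p != positive)
    correct = sum(1 for t, p in zip(y_true, y_pred) if t == p)
    accuracy = correct / len(y_true)
    recall = tp / (tp + fn) if tp + fn else 0.0
    precision = tp / (tp + fp) if tp + fp else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return accuracy, recall, f1
```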
- F. TECHNIQUES TO IMPROVE LEARNING MODELS
- 1) HYPERPARAMETER TUNING
- 2) LEARNING CURVE
- 3) ADJUSTING THE CLASSIFICATION THRESHOLD
- G. FEATURE IMPORTANCE
III. RESULTS AND DISCUSSION
- A. DATA PREPROCESSING AND AUGMENTATION
- B. FEATURE EXTRACTION AND SPECTROGRAM REPRESENTATION
- 1) FEATURE EXTRACTION RESULTS
- 2) SPECTROGRAM REPRESENTATION
- C. CNN CLASSIFICATION
| Hyperparameters | Values |
| Number of convolutional layers | 3 |
| Number of dense layers | 2 (fully connected) |
| Kernel size of the first layers | 3×3 |
| Number of neurons in the first dense layer | 256 |
| Number of neurons in the second dense layer | 8 |
| Activation function of the layers | ReLU and sigmoid |
| Pooling size after the first layer | 2×2 |
| Pooling size after the second layer | 2×2 |
| Dropout rate | 0.5 |
| Batch size | 32 |
| Loss function | Categorical cross-entropy |
| Learning rate | 0.001 |
| Learning rule | Adam |
- 1) CNN LEARNING CURVE ANALYSIS
- 2) CNN OPTIMAL CLASSIFIER THRESHOLD ANALYSIS
| Threshold | Metric | Training | Test |
| 0.44 | Accuracy | 0.94 | 0.97 |
| 0.44 | Recall | 0.96 | 0.96 |
| 0.44 | F1 score | 0.95 | 0.98 |
| 0.50 | Accuracy | 0.89 | 0.92 |
| 0.50 | Recall | 0.90 | 0.89 |
| 0.50 | F1 score | 0.92 | 0.94 |
- 3) DATA AUGMENTATION ANALYSIS
- 4) RANDOM FOREST CLASSIFICATION
| Training sample size | Metric | Training | Test |
| 75% | Accuracy | 0.90 | 0.83 |
| 75% | Recall | 0.88 | 0.85 |
| 75% | F1 score | 0.92 | 0.88 |
| 50% | Accuracy | 0.77 | 0.73 |
| 50% | Recall | 0.73 | 0.66 |
| 50% | F1 score | 0.80 | 0.79 |
| Non-augmented | Accuracy | 0.58 | 0.54 |
| Non-augmented | Recall | 0.48 | 0.42 |
| Non-augmented | F1 score | 0.63 | 0.54 |
| Hyperparameters | Values |
| n_estimators | 200 |
| max_depth | 50 |
| min_samples_split | 20 |
| min_samples_leaf | 1 |
| max_features | Auto |
| min_impurity_decrease | 0.0 |
| bootstrap | True |
| class_weight | 2×2 |
| ccp_alpha | 0.0 |
| criterion | entropy |
| warm_start | True |
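The tabulated settings map onto scikit-learn's `RandomForestClassifier` roughly as follows (a sketch: `max_features="auto"` was removed in recent scikit-learn releases, so its former classifier equivalent `"sqrt"` is used here, and the learning-rate/optimizer rows are omitted because a random forest has neither):

```python
from sklearn.ensemble import RandomForestClassifier

# Random forest configured from the hyperparameter table above.
rf = RandomForestClassifier(
    n_estimators=200,
    max_depth=50,
    min_samples_split=20,
    min_samples_leaf=1,
    max_features="sqrt",       # former "auto" behavior for classifiers
    min_impurity_decrease=0.0,
    bootstrap=True,
    ccp_alpha=0.0,
    criterion="entropy",
    warm_start=True,
)
```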
- 1) RF LEARNING CURVE ANALYSIS
- 2) RF OPTIMAL CLASSIFIER THRESHOLD ANALYSIS
| Threshold | Metric | Training | Test |
| 0.39 | Accuracy | 0.93 | 0.91 |
| 0.39 | Recall | 0.93 | 0.91 |
| 0.39 | F1 score | 0.93 | 0.92 |
| 0.50 | Accuracy | 0.89 | 0.92 |
| 0.50 | Recall | 0.90 | 0.89 |
| 0.50 | F1 score | 0.92 | 0.94 |
- 3) FEATURE IMPORTANCE ANALYSIS
IV. CONCLUSION
- Exploring different feature extraction techniques, informed by the important features identified in this study, to improve random forest performance.
- Investigating ensemble learning methods, including the combination of CNNs and random forest algorithms.
- Analyzing the effect of varying audio signal lengths on the detection of malfunctions.
- Evaluating different data augmentation strategies, such as ensemble methods.
- Incorporating multi-modal sensor data, including vibration and temperature sensors, to develop a more comprehensive system.
References
- R. Abbasi, “Fault detection and diagnosis in power transformers: a comprehensive review and classification of publications and methods,” Electric Power Systems Research, vol. 209, p. 107990, Aug. 2022. [CrossRef]
- Hackl, J. Kullick, and N. Monzen, “Generic loss minimization for nonlinear synchronous machines by analytical computation of optimal reference currents considering copper and iron losses,” in 2021 22nd IEEE International Conference on Industrial Technology (ICIT), IEEE, Mar. 2021, pp. 1348–1355. [CrossRef]
- L. Raeisian, H. Niazmand, E. Ebrahimnia-Bajestan, and P. Werle, “Thermal management of a distribution transformer: An optimization study of the cooling system using CFD and response surface methodology,” International Journal of Electrical Power & Energy Systems, vol. 104, pp. 443–455, Jan. 2019. [CrossRef]
- Lei, S. Bu, Q. Wang, N. Zhou, L. Yang, and X. Xiong, “Load Transfer Optimization Considering Hot-Spot and Top-Oil Temperature Limits of Transformers,” IEEE Transactions on Power Delivery, vol. 37, no. 3, pp. 2194–2208, Jun. 2022. [CrossRef]
- H. Amiri, “Analysis and comparison of actual behavior of oil-type and dry-type transformers during lightning,” in 2021 25th Electrical Power Distribution Conference (EPDC), IEEE, Aug. 2021, pp. 1–4. [CrossRef]
- M. S. Mahdi et al., “Effect of fin geometry on natural convection heat transfer in electrical distribution transformer: Numerical study and experimental validation,” Thermal Science and Engineering Progress, vol. 14, p. 100414, Dec. 2019. [CrossRef]
- M. Ngo, Y. Cao, D. Dong, R. Burgos, K. Nguyen, and A. Ismail, “Forced Air-Cooling Thermal Design Methodology for High-Density, High-Frequency, and High-Power Planar Transformers in 1U Applications,” IEEE J Emerg Sel Top Power Electron, vol. 11, no. 2, pp. 2015–2028, Apr. 2023. [CrossRef]
- Wang et al., “A New Testing Method for the Dielectric Response of Oil-Immersed Transformer,” IEEE Transactions on Industrial Electronics, vol. 67, no. 12, pp. 10833–10843, Dec. 2020. [CrossRef]
- S. Zhao, Q. Liu, M. Wilkinson, G. Wilson, and Z. Wang, “A Reduced Radiator Model for Simplification of ONAN Transformer CFD Simulation,” IEEE Transactions on Power Delivery, vol. 37, no. 5, pp. 4007–4018, Oct. 2022. [CrossRef]
- S. Tenbohlen, S. Coenen, M. Djamali, A. Müller, M. Samimi, and M. Siegel, “Diagnostic Measurements for Power Transformers,” Energies (Basel), vol. 9, no. 5, p. 347, May 2016. [CrossRef]
- W. Zhan, A. E. Goulart, M. Falahi, and P. Rondla, “Development of a Low-Cost Self-Diagnostic Module for Oil-Immerse Forced-Air Cooling Transformers,” IEEE Transactions on Power Delivery, vol. 30, no. 1, pp. 129–137, Feb. 2015. [CrossRef]
- L. Wang, W. Zuo, Z.-X. Yang, J. Zhang, and Z. Cai, “A Method for Fans’ Potential Malfunction Detection of ONAF Transformer Using Top-Oil Temperature Monitoring,” IEEE Access, vol. 9, pp. 129881–129889, 2021. [CrossRef]
- V. Shiravand, J. Faiz, M. H. Samimi, and M. Mehrabi-Kermani, “Prediction of transformer fault in cooling system using combining advanced thermal model and thermography,” IET Generation, Transmission & Distribution, vol. 15, no. 13, pp. 1972–1983, Jul. 2021. [CrossRef]
- Q. Zhang, Q. Zhou, Z. Lu, Z. Wei, L. Xu, and Y. Gui, “Recent Advances of SnO2-Based Sensors for Detecting Fault Characteristic Gases Extracted From Power Transformer Oil,” Front Chem, vol. 6, Aug. 2018. [CrossRef]
- Xiao-hui Cheng and Yang Wang, “The remote monitoring system of transformer fault based on The internet of Things,” in Proceedings of 2011 International Conference on Computer Science and Network Technology, IEEE, Dec. 2011, pp. 84–87. [CrossRef]
- Zou, R. Deng, Q. Mei, and L. Zou, “Fault diagnosis of a transformer based on polynomial neural networks,” Cluster Comput, vol. 22, no. S4, pp. 9941–9949, Jul. 2019. [CrossRef]
- L. Wang, W. Zuo, Z.-X. Yang, J. Zhang, and Z. Cai, “A Method for Fans’ Potential Malfunction Detection of ONAF Transformer Using Top-Oil Temperature Monitoring,” IEEE Access, vol. 9, pp. 129881–129889, 2021. [CrossRef]
- M. Djamali and S. Tenbohlen, “Malfunction Detection of the Cooling System in Air-Forced Power Transformers Using Online Thermal Monitoring,” IEEE Transactions on Power Delivery, vol. 32, no. 2, pp. 1058–1067, Apr. 2017. [CrossRef]
- H. Zhang, G. Liu, B. Lin, H. Deng, Y. Li, and P. Wang, “Thermal evaluation optimization analysis for non-rated load oil-natural air-natural transformer with auxiliary cooling equipment,” IET Generation, Transmission & Distribution, vol. 16, no. 15, pp. 3080–3091, Aug. 2022. [CrossRef]
- M. Djamali and S. Tenbohlen, “A validated online algorithm for detection of fan failures in oil-immersed power transformers,” International Journal of Thermal Sciences, vol. 116, pp. 224–233, Jun. 2017. [CrossRef]
- J. Picaut, A. Can, N. Fortin, J. Ardouin, and M. Lagrange, “Low-Cost Sensors for Urban Noise Monitoring Networks—A Literature Review,” Sensors, vol. 20, no. 8, p. 2256, Apr. 2020. [CrossRef]
- P. Thanapol, K. Lavangnananda, P. Bouvry, F. Pinel, and F. Leprevost, “Reducing Overfitting and Improving Generalization in Training Convolutional Neural Network (CNN) under Limited Sample Sizes in Image Recognition,” in 2020 - 5th International Conference on Information Technology (InCIT), IEEE, Oct. 2020, pp. 300–305. [CrossRef]
- Shorten and T. M. Khoshgoftaar, “A survey on Image Data Augmentation for Deep Learning,” J Big Data, vol. 6, no. 1, p. 60, Dec. 2019. [CrossRef]
- S. Park et al., “SpecAugment: A Simple Data Augmentation Method for Automatic Speech Recognition,” in Interspeech 2019, ISCA: ISCA, Sep. 2019, pp. 2613–2617. [CrossRef]
- L. K. Shahidi, L. M. Collins, and B. O. Mainsah, “Parameter tuning of time-frequency masking algorithms for reverberant artifact removal within the cochlear implant stimulus,” Cochlear Implants Int, vol. 23, no. 6, pp. 309–316, Nov. 2022. [CrossRef]
- Abeysinghe, S. Tohmuang, J. L. Davy, and M. Fard, “Data augmentation on convolutional neural networks to classify mechanical noise,” Applied Acoustics, vol. 203, p. 109209, Feb. 2023. [CrossRef]
- Q. Wang, J. Q. Wang, J. Du, H.-X. Wu, J. Pan, F. Ma, and C.-H. Lee, “A Four-Stage Data Augmentation Approach to ResNet-Conformer Based Acoustic Modeling for Sound Event Localization and Detection,” IEEE/ACM Trans Audio Speech Lang Process, vol. 31, pp. 1251–1264, 2023. [CrossRef]
- M. Goubeaud, P. Jousen, N. Gmyrek, F. Ghorban, and A. Kummert, “White Noise Windows: Data Augmentation for Time Series,” in 2021 7th International Conference on Optimization and Applications (ICOA), IEEE, May 2021, pp. 1–5. [CrossRef]
- J. Chen, W. Yi, and D. Wang, “Filter Bank Sinc-ShallowNet with EMD-based Mixed Noise Adding Data Augmentation for Motor Imagery Classification,” in 2021 43rd Annual International Conference of the IEEE Engineering in Medicine & Biology Society (EMBC), IEEE, Nov. 2021, pp. 5837–5841. [CrossRef]
- N. Yalta, S. Watanabe, T. Hori, K. Nakadai, and T. Ogata, “CNN-based Multichannel End-to-End Speech Recognition for Everyday Home Environments,” in 2019 27th European Signal Processing Conference (EUSIPCO), IEEE, Sep. 2019, pp. 1–5. [CrossRef]
- C.-I. Lai, Y.-S. Chuang, H.-Y. Lee, S.-W. Li, and J. Glass, “Semi-Supervised Spoken Language Understanding via Self-Supervised Speech and Language Model Pretraining,” in ICASSP 2021 - 2021 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), IEEE, Jun. 2021, pp. 7468–7472. [CrossRef]
- S. Khalid, T. Khalil, and S. Nasreen, “A survey of feature selection and feature extraction techniques in machine learning,” in 2014 Science and Information Conference, IEEE, Aug. 2014, pp. 372–378. [CrossRef]
- K. M. M. Prabhu, Window Functions and Their Applications in Signal Processing. Boca Raton: CRC Press, 2018. [CrossRef]
- J. Chen, B. Xu, and X. Zhang, “A Vibration Feature Extraction Method Based on Time-Domain Dimensional Parameters and Mahalanobis Distance,” Math Probl Eng, vol. 2021, pp. 1–12, Jul. 2021. [CrossRef]
- G. Sharma, K. Umapathy, and S. Krishnan, “Trends in audio signal feature extraction methods,” Applied Acoustics, vol. 158, p. 107020, Jan. 2020. [CrossRef]
- S. Kavitha and J. Manikandan, “Improved Methodology of SVM to Classify Acoustic Signal by Spectral Centroid,” Journal of Trends in Computer Science and Smart Technology, vol. 3, no. 4, pp. 294–304, May 2022. [CrossRef]
- M. Lagrange and F. Gontier, “Bandwidth Extension of Musical Audio Signals With No Side Information Using Dilated Convolutional Neural Networks,” in ICASSP 2020 - 2020 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), IEEE, May 2020, pp. 801–805. [CrossRef]
- J. Herre, E. Allamanche, and O. Hellmuth, “Robust matching of audio signals using spectral flatness features,” in Proceedings of the 2001 IEEE Workshop on the Applications of Signal Processing to Audio and Acoustics (Cat. No.01TH8575), IEEE, 2001, pp. 127–130. [CrossRef]
- J. T. Geiger, B. Schuller, and G. Rigoll, “Large-scale audio feature extraction and SVM for acoustic scene classification,” in 2013 IEEE Workshop on Applications of Signal Processing to Audio and Acoustics, IEEE, Oct. 2013, pp. 1–4. [CrossRef]
- X. Zhang, Z. Su, P. Lin, Q. He, and J. Yang, “An audio feature extraction scheme based on spectral decomposition,” in 2014 International Conference on Audio, Language and Image Processing, IEEE, Jul. 2014, pp. 730–733. [CrossRef]
- S. Hershey et al., “CNN architectures for large-scale audio classification,” in 2017 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), IEEE, Mar. 2017, pp. 131–135. [CrossRef]
- M. Dorfler, R. Bammer, and T. Grill, “Inside the spectrogram: Convolutional Neural Networks in audio processing,” in 2017 International Conference on Sampling Theory and Applications (SampTA), IEEE, Jul. 2017, pp. 152–155. [CrossRef]
- R. Nematirad, A. Pahwa, B. Natarajan, and H. Wu, “Optimal sizing of photovoltaic-battery system for peak demand reduction using statistical models,” Front Energy Res, vol. 11, Dec. 2023. [CrossRef]
- M. A. Siddiqi and W. Pak, “An Agile Approach to Identify Single and Hybrid Normalization for Enhancing Machine Learning-Based Network Intrusion Detection,” IEEE Access, vol. 9, pp. 137494–137513, 2021. [CrossRef]
- R. Nematirad and A. Pahwa, “Solar Radiation Forecasting Using Artificial Neural Networks Considering Feature Selection,” in 2022 IEEE Kansas Power and Energy Conference (KPEC), IEEE, Apr. 2022, pp. 1–4. [CrossRef]
- P. Linardatos, V. Papastefanopoulos, and S. Kotsiantis, “Explainable AI: A Review of Machine Learning Interpretability Methods,” Entropy, vol. 23, no. 1, p. 18, Dec. 2020. [CrossRef]
- M. P. Neto and F. V. Paulovich, “Explainable Matrix - Visualization for Global and Local Interpretability of Random Forest Classification Ensembles,” IEEE Trans Vis Comput Graph, vol. 27, no. 2, pp. 1427–1437, Feb. 2021. [CrossRef]
- Yousaf et al., “Emotion Recognition by Textual Tweets Classification Using Voting Classifier (LR-SGD),” IEEE Access, vol. 9, pp. 6286–6295, 2021. [CrossRef]
- H. Cho, Y. Kim, E. Lee, D. Choi, Y. Lee, and W. Rhee, “Basic Enhancement Strategies When Using Bayesian Optimization for Hyperparameter Tuning of Deep Neural Networks,” IEEE Access, vol. 8, pp. 52588–52608, 2020. [CrossRef]
- J. B. Morrison, “Putting the learning curve in context,” J Bus Res, vol. 61, no. 11, pp. 1182–1190, Nov. 2008. [CrossRef]
- K. H. Zou, C.-R. Yu, K. Liu, M. O. Carlsson, and J. Cabrera, “Optimal Thresholds by Maximizing or Minimizing Various Metrics via ROC-Type Analysis,” Acad Radiol, vol. 20, no. 7, pp. 807–815, Jul. 2013. [CrossRef]
- T.-T.-H. Le, H. Kim, H. Kang, and H. Kim, “Classification and Explanation for Intrusion Detection System Based on Ensemble Trees and SHAP Method,” Sensors, vol. 22, no. 3, p. 1154, Feb. 2022. [CrossRef]
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).