Submitted:
19 February 2024
Posted:
19 February 2024
Abstract
Keywords:
1. Introduction
- This paper introduces an innovative approach by integrating Photoplethysmography (PPG) and Electrocardiogram (ECG) signals for heart failure assessment. This integration leverages the unique strengths of both non-invasive monitoring methods to enhance diagnostic accuracy and enable early detection of heart failure.
- The study underscores the clinical relevance of this integrated approach, emphasizing its potential to improve patient care, offer personalized treatment plans, and reduce healthcare costs. By preventing advanced heart failure complications, it has the potential to generate substantial cost savings for healthcare systems.
- The proposed integrated method achieves significant improvements over the results obtained from the individual ECG and PPG signals, demonstrating the value of combining the two modalities. This not only enhances diagnostic accuracy but also supports early detection in the assessment and management of heart failure.
2. Materials and Methods
| ICD-9 codes | Name |
|---|---|
| 39891 | Rheumatic heart failure (congestive) |
| 40201 | Malignant hypertensive heart disease with heart failure |
| 40211 | Benign hypertensive heart disease with heart failure |
| 40291 | Unspecified hypertensive heart disease with heart failure |
| 40401 | Hypertensive heart and chronic kidney disease, malignant, with heart failure and with chronic kidney disease stage I through stage IV, or unspecified |
| 40403 | Hypertensive heart and chronic kidney disease, malignant, with heart failure and with chronic kidney disease stage V or end stage renal disease |
| 40411 | Hypertensive heart and chronic kidney disease, benign, with heart failure and with chronic kidney disease stage I through stage IV, or unspecified |
| 40413 | Hypertensive heart and chronic kidney disease, benign, with heart failure and chronic kidney disease stage V or end stage renal disease |
| 40491 | Hypertensive heart and chronic kidney disease, unspecified, with heart failure and with chronic kidney disease stage I through stage IV, or unspecified |
| 40493 | Hypertensive heart and chronic kidney disease, unspecified, with heart failure and chronic kidney disease stage V or end stage renal disease |
| 4280 | Congestive heart failure, unspecified |
| 4281 | Left heart failure |
| 42820 | Systolic heart failure, unspecified |
| 42821 | Acute systolic heart failure |
| 42822 | Chronic systolic heart failure |
| 42823 | Acute on chronic systolic heart failure |
| 42830 | Diastolic heart failure, unspecified |
| 42831 | Acute diastolic heart failure |
| 42832 | Chronic diastolic heart failure |
| 42833 | Acute on chronic diastolic heart failure |
| 42840 | Combined systolic and diastolic heart failure, unspecified |
| 42841 | Acute combined systolic and diastolic heart failure |
| 42842 | Chronic combined systolic and diastolic heart failure |
| 42843 | Acute on chronic combined systolic and diastolic heart failure |
| 4289 | Heart failure, unspecified |
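The ICD-9 codes above define the heart-failure cohort. As a minimal sketch of how records could be selected against such a code list, the following uses invented record structures and field names (the study's actual MIMIC-III query is not shown here):

```python
# Hypothetical sketch: selecting heart-failure records by the ICD-9 codes
# listed in the table above. The record layout is an assumption for
# illustration, not the study's actual MIMIC-III schema.
HF_ICD9_CODES = {
    "39891", "40201", "40211", "40291", "40401", "40403", "40411", "40413",
    "40491", "40493", "4280", "4281", "42820", "42821", "42822", "42823",
    "42830", "42831", "42832", "42833", "42840", "42841", "42842", "42843",
    "4289",
}

def is_heart_failure(diagnosis_codes):
    """Return True if any diagnosis code matches a heart-failure ICD-9 code."""
    return any(code in HF_ICD9_CODES for code in diagnosis_codes)

# Example records: (subject_id, list of ICD-9 diagnosis codes)
records = [
    (1, ["4280", "25000"]),   # congestive heart failure, unspecified
    (2, ["41401"]),           # coronary atherosclerosis only
    (3, ["42823", "5849"]),   # acute on chronic systolic heart failure
]
hf_subjects = [sid for sid, codes in records if is_heart_failure(codes)]
print(hf_subjects)  # [1, 3]
```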
2.2. Feature Extraction
| Feature | Description | Duration | Disease Diagnosis |
|---|---|---|---|
| Pulse pressure | Difference between the systolic and diastolic blood pressure | 0.5–10 mmHg | Atherosclerosis, Congestive heart failure |
| Systolic pressure | Indicator of the pulsatile changes in blood volume caused by arterial blood flow | 80–120 mmHg | Artery stiffness, Heart failure |
| P-wave | Atrial depolarization | 0.08–0.11 s | Heart failure |
| Diastolic pressure | Represents the amplitude of the signal during the diastolic phase of the cardiac cycle | <80 mmHg | Ischemic heart disease, Cardiomyopathy |
| Peak to peak interval | Represents the duration between successive peaks in a signal | 0.6–1.2 s | Atrial fibrillation, Heart failure |
| RR interval | The interval between two successive R-waves of the QRS complex; reflects the ventricular rate | 0.6–1.2 s | Paroxysmal atrial fibrillation, Congestive heart failure |
| Augmentation index | Ratio of the augmentation pressure to the pulse pressure, reflecting arterial wave reflection and stiffness | 20–80% | Heart failure |
| Heart rate | A measure of the number of times the heart contracts or beats within a specific time frame, usually one minute | 60–100 bpm | Heart failure, Atrial fibrillation |
| QRS interval | Ventricular depolarization | 0.08–0.11 s | Heart failure, Tachycardia, Acute coronary syndrome |
| RMSSD, NN50, pNN50 | Show how active the parasympathetic system is relative to the sympathetic nervous system | 19–48 ms; 5–25 ms; 5–18% | Heart failure, Hypertension, Arrhythmia, Coronary artery disease |
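The HRV features in the last row of the table (RMSSD, NN50, pNN50) follow standard time-domain definitions over successive RR intervals. A minimal illustrative sketch (not the study's actual extraction code, and with an invented RR sequence):

```python
import math

def hrv_time_domain(rr_ms):
    """Compute RMSSD, NN50 and pNN50 from successive RR intervals in ms.

    Standard time-domain HRV definitions: RMSSD is the root mean square of
    successive differences, NN50 counts successive differences >50 ms, and
    pNN50 is NN50 as a percentage of all successive differences.
    """
    diffs = [rr_ms[i + 1] - rr_ms[i] for i in range(len(rr_ms) - 1)]
    rmssd = math.sqrt(sum(d * d for d in diffs) / len(diffs))
    nn50 = sum(1 for d in diffs if abs(d) > 50)
    pnn50 = 100.0 * nn50 / len(diffs)
    return rmssd, nn50, pnn50

rr = [800, 810, 790, 860, 805, 800]  # example RR intervals in ms
rmssd, nn50, pnn50 = hrv_time_domain(rr)
```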
2.3. Feature Normalization
2.4. Feature Importance Analysis and Feature Selection


2.5. Data Partitioning and Classical Machine learning
- Random Forest (RF): Random Forest (RF) stands out as a well-known nonparametric tree-based supervised machine learning method, proficient in handling classification and regression tasks [28]. RF algorithms create numerous machine learning models and consolidate their results to arrive at more robust decisions or estimations than individual models could achieve in isolation [28]. In comparison to various other machine learning techniques, RF offers distinct advantages: the base estimators within a random forest are trained independently, so training can be parallelized and completed more quickly [29].
- Adaptive Boost (AdaBoost): AdaBoost represents yet another powerful ensemble learning technique, applicable to both classification and regression tasks [30]. This approach operates by amalgamating multiple weak learners, which individually perform sub-optimally, into a robust learner [31]. AdaBoost is particularly suitable for scenarios with a large number of features, efficiently selecting the most informative ones. Additionally, it effectively addresses class imbalance by adjusting training sample weights, ensuring fair attention to both positive and negative cases during training.
- Naïve Bayes: Naïve Bayes classifiers employ Bayes' probability theorem for data classification. They assume that all features are independent of each other; because this assumption is a simplification, they are referred to as 'naïve.' Bayes' rule calculates the probability of an event based on its relationship with another variable, with a basic representation as follows: P(A|B) = P(B|A)·P(A)/P(B). Naïve Bayes is computationally efficient, suitable for large datasets and real-time applications.
- Decision Tree (DT): Decision Trees are an intuitive, tree-like structured, non-parametric approach used for classification. To prevent overfitting, Decision Trees use pruning techniques to simplify the tree structure. They accommodate both categorical and numerical features, necessitating minimal data preprocessing and displaying proficiency in handling missing values. This robustness in the face of missing data is especially valuable for real-world situations where data incompleteness and noise are common.
- Support Vector Machine (SVM): SVM accomplishes classification by identifying the optimal hyperplane that maximizes the margin between classes [32]. When combining non-invasive signals such as PPG and ECG, the resulting feature space can be complex and high-dimensional. SVM handles such data by finding a hyperplane that maximally separates the classes, even in high-dimensional feature spaces. This capability allows SVM to capture complex relationships and patterns in the data, which can be crucial for accurately classifying heart failure patients.
- K-Nearest Neighbors (KNN): KNN classifies data points by taking a vote based on the class types of their k nearest neighbors in multi-dimensional space. This localized approach is advantageous in scenarios where nearby data points hold significant influence. In the context of heart failure evaluation, KNN can uncover subtle patterns in PPG and ECG signals, aiding in diagnosis. In its basic form, KNN assigns a class depending on the largest class within the k nearest neighbors.
- Multilayer Perceptron (MLP): MLPs are a class of feedforward neural networks [33]. They excel at modeling complex, nonlinear data relationships, but require careful design of the architecture, including the number of hidden layers, the number of neurons, and the activation functions, for optimal performance. MLPs have been applied in a range of domains including image and speech recognition, natural language processing, and medical diagnosis, showcasing their versatility in machine learning [34,35].
- Random Tree (RT): In practice, Random Tree algorithms offer a good balance between simplicity and performance and are a valuable tool for various machine learning tasks [28]. The Random Tree algorithm is built on the foundation of decision trees, which are known for their ability to recursively partition data into subsets based on feature values. However, Random Trees introduce an element of randomness into the decision tree construction process through random feature selection and Bootstrap Aggregating (Bagging).
- Bayesian Net (BayesNet): A Bayesian Network is a probabilistic graphical model widely used for various machine learning tasks, especially in fields like healthcare, finance, and natural language processing. It is based on Bayesian probability theory and graph theory, offering a compact and intuitive way to represent complex probabilistic relationships among variables. The effectiveness of BayesNets in managing uncertainty is one of their main advantages. BayesNets also offer interpretable results, showing how variables influence each other, which is crucial for decision-making in sensitive areas like healthcare.
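To make the KNN idea from the list above concrete, the following is a minimal pure-Python sketch of majority voting among the k nearest neighbors. The toy feature vectors (heart rate and QRS interval) and class labels are invented for illustration; the study itself used standard toolkit implementations of these classifiers:

```python
import math
from collections import Counter

def knn_predict(train_X, train_y, x, k=3):
    """Classify x by majority vote among its k nearest training points.

    A minimal illustration of the KNN voting scheme described above;
    distances are plain Euclidean distances in feature space.
    """
    dists = sorted(
        (math.dist(x, xi), yi) for xi, yi in zip(train_X, train_y)
    )
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]

# Toy feature vectors: [heart rate (bpm), QRS interval (s)] -- invented values
train_X = [[72, 0.09], [68, 0.10], [110, 0.14], [115, 0.15], [105, 0.13]]
train_y = ["healthy", "healthy", "HF", "HF", "HF"]
print(knn_predict(train_X, train_y, [100, 0.12], k=3))  # "HF"
```

In practice the feature vectors would hold the normalized PPG/ECG features from Section 2.2, and k would be tuned on the training partition.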
| Algorithm | Strengths | Reasons for selection |
|---|---|---|
| SVM (Support Vector Machine) | • Effective in high-dimensional spaces • Works well with complex datasets • Good generalization capabilities | Chosen for its ability to handle complex feature spaces and its effectiveness in classification tasks [36,37]. |
| Random Forest | • High predictive accuracy • Handles both numerical and categorical data • Reduces overfitting | Selected for its robustness and ability to deal with noisy data, which is common in medical datasets [28,29]. |
| K-Nearest Neighbor (KNN) | • Simple and intuitive • Non-parametric and adaptable | Employed for its simplicity and adaptability in classifying data points based on their proximity to neighbors [38,39,40]. |
| Random Tree | • Ensemble method combining decision trees • Resistant to overfitting • Handles mixed data types | Utilized for its robustness and versatility in handling various data types, its potential for accurate classification, and its efficient handling of missing values [41,42,43]. |
| AdaBoost | • Sequential ensemble learning • Combines weak learners into a stronger model • Good at handling imbalanced data | Chosen for its ability to improve model accuracy by sequentially learning from previous models' mistakes [30,43]. |
| BayesNet (Bayesian Network) | • Probabilistic graphical model • Good for modeling dependencies among variables | Employed for its ability to model complex relationships and dependencies between features in medical data [44,45]. |
| Decision Tree | • Intuitive and easy to understand • Interpretable | Selected for its simplicity and interpretability, making it useful for gaining insights into feature importance [46,47]. |
| NaiveBayes | • Simple and computationally efficient • Performs well with limited data | Chosen for its efficiency in handling datasets with limited samples and its short training time [48,49]. |
| MLP (Multilayer Perceptron) | • Deep learning architecture • Can model complex non-linear relationships • Suitable for large datasets | Adopted for its deep architecture, which enables it to recognize complex connections and trends in the data [50,51]. |
3. Results and Discussion
3.1. Result from Classification with Features Extracted from Single PPG Signal

3.2. Result from Classification with Features Extracted from Single ECG Signal
3.3. Result from Classification of Integrated Features Extracted from PPG and ECG Signals
3.4. Comparison of Results Obtained

| Author | Dataset | Signal | Features extracted | Algorithm | Acc. (%) | Sens. (%) | Spec. (%) | Pre. (%) | F1-Score (%) |
|---|---|---|---|---|---|---|---|---|---|
| Simge et al. [52] | UCI 300 | ECG | Chol, trestbps, fbs, restecg, slope | Cubic SVM / Linear SVM / DT / Ensemble | 52.3 / 67.3 / 67.7 / 67.0 | - | - | - | - |
| Ali et al. [53] | UCI | ECG | RestECG, Trestbps, Chol, fbs | KNN / SVM / NaïveBayes | 80 / 83 / 84 | - | 75 / 77 / 80 | 80 / 82 / 83 | - |
| Shouman et al. [54] | CHDD | ECG | Chol, trestbps, fbs, restecg | GRDT / NaïveBayes / KNN | 79.1 / 83.5 / 83.2 | 75.6 / 78.0 / 76.7 | 81.6 / 80.8 / 85.1 | - | - |
| Tu et al. [55] | CI | ECG | Chol, trestbps, fbs, restecg | Bagging / DT | 81.41 / 78.91 | 74.93 / 72.01 | 86.64 / 84.48 | - | - |
| Bashir et al. [56] | CHDD 303 | ECG | Chol, trestbps, fbs, restecg | Ensemble / NaïveBayes / DT / SVM | 81.82 / 78.79 / 76.57 / 86.67 | 73.68 / 68.49 / 63.58 / 73.68 | 92.86 / 92.86 / 71.24 / 79.51 | - | 82.17 / 73.61 / 71.51 / 65.10 |
| Pal et al. [13] | 50 | PPG | Crest-time, Augmentation index, pulse pressure, SA/DA | BT / SVM / KNN / LR | 94 / 85 / 83 / 83 | 95 / 83 / 79 / 83 | 5 / 87 / 82 / 85 | 97 / 83 / 97 / 82 | 96 / 87 / 89 / 85 |
| Banerjee et al. [57] | MIMIC II 112 | PPG | Systolic peak, NN-interval, HRV | SVM | - | 82 | 88 | - | - |
| Paradkar et al. [58] | MIMIC II 55 | PPG | Augmentation index, stiffness index | SVM | - | 85 | 78 | - | - |
| Current Study | MIMIC III 1636 | ECG | QRS interval, RR-interval, HRV, Heart rate, P-wave | MLP | 96.40 | 96.70 | 96.00 | 95.30 | 95.90 |
| Current Study | MIMIC III 1636 | PPG | S.P, NN-interval, D.A, P.A, A.I | RF | 97.10 | 97.05 | 96.88 | 91.20 | 96.66 |
| Current Study | MIMIC III 1636 | Integration of PPG and ECG signal | - | SVM | 98.00 | 97.60 | 96.90 | 97.20 | 97.70 |
4. Limitations of the Study
5. Clinical Application Prospect, Future Work and Conclusion
Author Contributions
Funding
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
- Groenewegen, A.; Rutten, F.; Mosterd, A.; Hoes, A. Epidemiology of heart failure. European Journal of Heart Failure 2020, 22, 1342–1356. [Google Scholar] [CrossRef] [PubMed]
- An B, Shin J, Kim S, et al. Smart Sensor Systems for Wearable Electronic Devices. Polymers (Basel) 2017, 9, 1–41. [Google Scholar]
- Roger, Veronique L. Epidemiology of heart failure. Circulation Research 2013, 113, 646–659. [Google Scholar] [CrossRef]
- Go A, Mozaffarian D, Roger V, et al. American Heart Association Statistics Committee and Stroke Statistics Subcommittee. Heart disease and stroke statistics--2013 update: a report from the American Heart Association. Circulation 2013, 127, e6–e245. [Google Scholar]
- Yancy C, Jessup M, Bozkurt B, et al. Guideline for the management of heart failure. A report of the American College of Cardiology Foundation/American Heart Association Task Force on Practice Guidelines. Journal of the American College of Cardiology 2013, 62, e147–e239. [Google Scholar]
- Gedela, M.; Khan, M.; Jonsson, O. Heart Failure. Available online: https://www.researchgate.net/publication/283899687.
- Pielmuş A, Osterland D, Klum, M, et al. Correlation of arterial blood pressure to synchronous piezo, impedance and photoplethysmographic signal features. Current Directions in Biomedical Engineering 2017, 3, 749–753. [Google Scholar] [CrossRef]
- Bruno R, Duranti E, Ippolito C, et al. Different Impact of Essential Hypertension on Structural and Functional Age-Related Vascular Changes. Hypertension 2017, 69, 71–78. [Google Scholar] [CrossRef]
- Allen, J. Photoplethysmography and its application in clinical physiological measurement. Physiol Measur 2007, 28, R1. Available online: http://stacks.iop.org/0967-3334/28/i=3/a=R01. [CrossRef]
- Banerjee R, Vempada R, Mandana K, et al. Identifying coronary artery disease from photoplethysmogram. ACM International Joint Conference 2016, 1084–1088. [Google Scholar] [CrossRef]
- Vo K, Kasaeyan N, Emad N, Amir J, et al. ECG waveform synthesis from PPG with conditional Wasserstein generative adversarial networks. 2021, 1030–1036. [CrossRef]
- Paradkar, N.; Chowdhury, S. Coronary artery disease detection using photoplethysmography. Annual International Conference of the IEEE Engineering in Medicine and Biology Society. IEEE Engineering in Medicine and Biology Society. Annual International Conference. 2017, 2017, 100–103. [Google Scholar] [CrossRef]
- Pal P, Ghosh S, Chattopadhyay B, et al. Screening of Ischemic Heart Disease based on PPG Signals using Machine Learning Techniques [C]. 42nd Annual International Conference of the IEEE Engineering in Medicine & Biology Society (EMBC), Montreal, QC, Canada. 2020, 2020, 5980–5983. [CrossRef]
- Bashir, S.; Usman, Q.; Farhan, H. A Multicriteria Weighted Vote Based Classifier Ensemble for Heart Disease Prediction. Computational Intelligence 2016, 32, 615–645. [Google Scholar] [CrossRef]
- Tian, X.; Zhu, Q.; Li, Y.; Wu, M. Cross-Domain Joint Dictionary Learning for ECG Reconstruction from PPG.2020. 936-940. [CrossRef]
- Chen A, Huang S, Hong P, Cheng C, Lin E. HDPS: heart disease prediction system. In Computing in Cardiology. IEEE: Hangzhou, 2022. pp. 557–560.
- Kamath, C. A new approach to detect congestive heart failure using detrended fluctuation analysis of electrocardiogram signals. J Eng Sci Technol 2015, 10, 145–59. [Google Scholar] [CrossRef]
- Moody B, Moody G, Villarroel M, Clifford G, Silva I. MIMIC-III Waveform Database (version 1.0). PhysioNet, 2020. [CrossRef]
- MathWorks. Feature Extraction. https://www.mathworks.com/discovery/feature-extraction (accessed 5 February 2024).
- Task Force of the European Society of Cardiology and the North American Society of Pacing and Electrophysiology. Heart rate variability: Standards of measurement, physiological interpretation and clinical use. Circulation 1996, 93, 1043–1065. [Google Scholar] [CrossRef]
- Xu H, Li J, Zhong G, et al. Characteristics of the Dynamic Electrocardiogram in the Elderly with Nonvalvular Atrial Fibrillation Combined with Long R-R Intervals. Evidence Based Complement Alternate Medicine 2021, 2021, 4485618. [Google Scholar] [CrossRef]
- Woodward M, Webster R, Murakami Y, et al. The association between resting heart rate, cardiovascular disease and mortality: evidence from 112,680 men and women in 12 cohorts. European Journal of Preventive Cardiology 2014, 21, 719–726. [Google Scholar] [CrossRef]
- Kira, K.; Rendell, L. The feature selection problem: Traditional methods and a new algorithm. In: AAAI, 1992, 129‐134. [CrossRef]
- Kira, K.; Rendell, L. A practical approach to feature selection. In: Proceedings of the ninth international workshop on machine learning (ML92) 1992, 249-256.
- Urbanowicz, R.; Meeker, M.; Cava, W.; Olson, R.; Moore, J. Relief-based feature selection: Introduction and review. Journal of Biomedical Informatics 2018, 85, 189–203. [Google Scholar] [CrossRef]
- Kononenko, I. Estimating attributes: Analysis and extensions of RELIEF. European Conference on Machine Learning 1994, 784, 171–182. [Google Scholar] [CrossRef]
- Witten, I.; Frank, E. Data Mining: Practical machine learning tools and techniques, 2nd Edition, Morgan Kaufmann, San Francisco, 2005.
- Leo Breiman. Random forests. Machine learning 2001, 45, 5–32. [Google Scholar] [CrossRef]
- Liaw, A.; Wiener, M. Classification and regression by random Forest. R News 2001, 2, 18–22. [Google Scholar]
- Freund, Y.; Schapire, R. Experiments with a new boosting algorithm. In ICML, 1996, 2, 148–156. [Google Scholar]
- Freund Y, Schapire R, Abe N. A short introduction to boosting. Journal of Japan Society for Artificial Intelligence 1999, 14, 1612. [Google Scholar]
- Corinna, C.; Vapnik, V. Support-vector networks. Journal of Machine Learning 1995, 20, 273–297. [Google Scholar]
- Kurt H, Maxwell S, Halbert White. Multilayer feedforward networks are universal approximators. Neural networks 1989, 2, 359–366. [Google Scholar] [CrossRef]
- Bishop, C. Neural networks for pattern recognition. Oxford university press 1995, 1, 145–164. [Google Scholar]
- Goodfellow I, Bengio Y, Courville A, Bengio Y. Deep learning. MIT Press Cambridge 2016, 8, 170–224. [Google Scholar]
- Yinglin, X. Correlation and association analyses in microbiome study integrating multiomics in health and disease, Editor(s): Jun Sun, Progress in Molecular Biology and Translational Science, Academic Press 2016, (171): 309 – 491. [CrossRef]
- Bhavsar, H.; Ganatra, A. Comparative Study of Training Algorithms for Supervised Machine Learning. International Journal of Soft Computing and Engineering 2012, 2, 2231–2307. [Google Scholar]
- An, Q.; Rahman, S.; Zhou, J.; Kang, J. A Comprehensive Review on Machine Learning in Healthcare Industry: Classification, Restrictions, Opportunities and Challenges. Sensors. 2023, 23, 4178. [Google Scholar] [CrossRef]
- Duneja, A.; Puyalnithi, T. Enhancing classification accuracy of k-nearest neighbors algorithm using gain ratio. Int. Res. J. Eng. Technol 2017, 4, 1385–1388. [Google Scholar]
- Shouman, M.; Turner, T.; Stocker, R. Applying k-nearest neighbour in diagnosing heart disease patients. Int. J. Inf. Educ. Technol. 2012, 2, 220–223. [Google Scholar] [CrossRef]
- Kulkarni, V.; Sinha, P. Efficient Learning of Random Forest Classifier using Disjoint Partitioning Approach. Proceedings of the World Congress on Engineering 2013, 20132, 1–5. [Google Scholar]
- Ying, Mi. Imbalanced Classification Based on Active Learning SMOTE, Research Journal of Applied Sciences, Engineering and Technology 2013, 5, 944-949.
- More, A.; Rana, S.; Dipti, P. Review of random forest classification techniques to resolve data imbalance. 72–78. [CrossRef]
- Friedman N, Geiger D, Goldszmidt Moises. Bayesian Network Classifiers. Machine Learning 2003, 29, 131–163. [Google Scholar] [CrossRef]
- Uusitalo Laura. Advantages and challenges of Bayesian networks in environmental modeling. Ecological Modelling 2006, 203, 312–318. [Google Scholar] [CrossRef]
- Song, Y.; Lu, Y. Decision tree methods: applications for classification and prediction. Shanghai archives of psychiatry 2015, 27, 130–135. [Google Scholar] [CrossRef] [PubMed]
- Vijay, K.; Bala, D. Data Science (Second Edition), Morgan Kaufmann 2019, Pages 65-163, ISBN 9780128147610. [CrossRef]
- De S, Gilberto F, et al. Engineering Systemsʹ Fault Diagnosis Methods. Reliability Analysis and Asset Management of Engineering Systems. 2021; 165-187, Accessed 16 Dec. 2023. [CrossRef]
- Dulhare, U. Prediction system for heart disease using Naive Bayes and particle swarm optimization. Biomed. Res. 2018, 29, 2646–2649. [Google Scholar] [CrossRef]
- Akkaya, B.; Çolakoğlu, N. Comparison of Multi-class Classification Algorithms on Early Diagnosis of Heart Diseases. 2019, 3, 261–311.
- Bikku, T. Multi-layered deep learning perceptron approach for health risk prediction. J Big Data 2020, 7, 50. [Google Scholar] [CrossRef]
- Ekiz, S.; Pakize, E. Comparative study of heart disease classification. 2017 Electric Electronics, Computer Science, Biomedical Engineerings' Meeting (EBBT), 2017, 1–4.
- Nassif, A.; Mahdi, O.; Nasir, Q.; Talib, M.; Azzeh, M. Machine learning classifications of coronary artery disease. In Proceedings of the 2018 International Joint Symposium on Artificial Intelligence and Natural Language Processing (iSAI-NLP), Pattaya, Thailand, 15–18 November 2018; IEEE: New York, NY, USA. pp. 1–6. [Google Scholar]
- Shouman, M.; Turner, T.; Stocker, R. Integrating Naive Bayes and K-means clustering with different initial centroid selection methods in the diagnosis of heart disease patients. Computer Science and Information Technology.
- Chau, T.; Dongil, S.; Dongkyoo, S. Effective Diagnosis of Heart Disease through Bagging Approach. IEEE 2nd International Conference on Biomedical Engineering and Informatics - Tianjin, China 2009, 10, 1–4. [Google Scholar] [CrossRef]
- Bashir, S.; et al. A Multicriteria Weighted Vote-Based Classifier Ensemble for Heart Disease Prediction. Computational Intelligence 2016, 32, 615. [Google Scholar] [CrossRef]
- Banerjee, R.; Vempada, R.; Mandana, K.; Choudhury, A.; Dutta, P. Identifying coronary artery disease from photoplethysmogram. Proceedings of the 2016 ACM International Joint Conference on Pervasive and Ubiquitous Computing Adjunct, 2016, 1084–1088. [CrossRef]
- Paradkar, N.; Chowdhury, S. Coronary artery disease detection using photoplethysmography. Annual International Conference of the IEEE Engineering in Medicine and Biology Society. IEEE Engineering in Medicine and Biology Society. Annual International Conference, 2017; 2017, 100–103. [Google Scholar] [CrossRef]




| Class of Feature | Features | Information provided |
|---|---|---|
| Class 1 (Amplitude features) | Pulse pressure, Systolic pressure, Diastolic pressure, P-wave | Monitoring these amplitude features over time can provide insights into the progression of heart failure and the effectiveness of therapeutic interventions aimed at managing vascular resistance. |
| Class 2 (Interval information) | Peak-to-peak interval, QRS interval, RR interval | Changes in these intervals can indicate alterations in cardiac function and hemodynamics associated with heart failure. Researchers can gain insights into the pathophysiology of heart failure and assess the severity of the condition. |
| Class 3 (Physiological features) | Augmentation index, HRV parameters (pNN50, NN50, RMSSD, SDNN), Heart rate | They offer insights into heart function, blood flow, arterial stiffness, and autonomic nervous system activity. |
| Model | Accuracy (%) | Sensitivity (%) | Specificity (%) | Precision (%) | AUC (%) | F1-Score (%) |
|---|---|---|---|---|---|---|
| SVM | 98.00 | 97.60 | 96.90 | 97.20 | 98.80 | 97.70 |
| Random Forest | 96.80 | 96.70 | 96.90 | 96.20 | 99.60 | 96.40 |
| K-NN | 94.90 | 79.30 | 95.70 | 94.80 | 95.30 | 86.20 |
| Random Tree | 96.90 | 96.70 | 96.80 | 96.20 | 96.80 | 98.50 |
| AdaBoost | 96.90 | 96.80 | 96.80 | 85.10 | 99.60 | 91.87 |
| BayesNet | 95.50 | 95.70 | 95.50 | 94.60 | 96.80 | 95.20 |
| Decision Tree | 96.00 | 95.70 | 96.40 | 95.70 | 96.80 | 95.70 |
| NaiveBayes | 91.20 | 91.30 | 91.00 | 89.40 | 95.20 | 90.30 |
| MLP | 96.50 | 96.80 | 96.50 | 95.70 | 99.80 | 96.30 |
| Signal | Model | Accuracy (%) | Sensitivity (%) | Specificity (%) | Precision (%) | F1-Score (%) | AUC (%) |
|---|---|---|---|---|---|---|---|
| PPG | Random Forest | 97.10 | 97.05 | 96.88 | 96.28 | 97.20 | 96.66 |
| ECG | MLP | 96.40 | 96.70 | 96.00 | 95.30 | 95.90 | 95.60 |
| Integration | SVM | 98.00 | 97.60 | 96.90 | 97.20 | 98.40 | 97.70 |
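The accuracy, sensitivity, specificity, precision and F1-score columns reported above follow the standard confusion-matrix definitions. A minimal sketch of those formulas, using invented counts rather than the study's actual confusion matrices:

```python
def classification_metrics(tp, fp, tn, fn):
    """Standard confusion-matrix metrics, returned as percentages."""
    accuracy = 100 * (tp + tn) / (tp + fp + tn + fn)
    sensitivity = 100 * tp / (tp + fn)   # recall / true-positive rate
    specificity = 100 * tn / (tn + fp)   # true-negative rate
    precision = 100 * tp / (tp + fp)
    f1 = 2 * precision * sensitivity / (precision + sensitivity)
    return accuracy, sensitivity, specificity, precision, f1

# Illustrative counts (not taken from the study):
acc, sens, spec, prec, f1 = classification_metrics(tp=90, fp=5, tn=95, fn=10)
```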
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content. |
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).