Preprint Article, Version 1. Preserved in Portico. This version is not peer-reviewed.

Minimum Relevant Features to Obtain Explainable Systems for Predicting Cardiovascular Disease Using the Statlog Dataset

Version 1: Received: 12 December 2020 / Approved: 14 December 2020 / Online: 14 December 2020 (09:49:13 CET)

A peer-reviewed article of this Preprint also exists.

Porto, R.; Molina, J.M.M.; Berlanga, A.; Patricio, M.A.A. Minimum Relevant Features to Obtain Explainable Systems for Predicting Cardiovascular Disease Using the Statlog Data Set. Appl. Sci. 2021, 11, 1285.

Abstract

Learning systems have traditionally focused on building models that achieve the best results on error metrics. Recently, the focus has shifted toward making models interpretable and able to explain their results. The need for interpretation is greater when these models are used to support decision making, and in some areas, such as medicine, it becomes an indispensable requirement. This paper focuses on the prediction of cardiovascular disease by analyzing the well-known Statlog (Heart) data set from the UCI Machine Learning Repository. The study analyzes the trade-off involved in making predictions easier to interpret: reducing the number of features that explain the classification of health status versus the cost in accuracy. This trade-off is evaluated across a large set of classification techniques and performance metrics, demonstrating that it is possible to build explainable and reliable models that maintain a good level of predictive performance.
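For illustration only, the sketch below shows one way the feature-count-versus-accuracy trade-off described in the abstract can be measured with scikit-learn. It is not the authors' pipeline: the UCI file URL, the use of mutual-information feature selection, and the logistic-regression classifier are all assumptions made for this example.

```python
# Minimal sketch (not the authors' method): measuring how accuracy on
# the Statlog (Heart) data set changes as the feature count shrinks.
# Assumes the standard UCI file layout: 13 whitespace-separated
# features plus a label column (1 = absence, 2 = presence of disease).
import pandas as pd
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Assumed location of the data set in the UCI repository.
URL = ("https://archive.ics.uci.edu/ml/machine-learning-databases/"
       "statlog/heart/heart.dat")
data = pd.read_csv(URL, sep=r"\s+", header=None)
X, y = data.iloc[:, :13], data.iloc[:, 13]

# Score a pipeline that keeps only the k most informative features,
# from the full feature set down to a highly interpretable handful.
for k in (13, 7, 3):
    model = make_pipeline(
        StandardScaler(),
        SelectKBest(mutual_info_classif, k=k),
        LogisticRegression(max_iter=1000),
    )
    acc = cross_val_score(model, X, y, cv=10).mean()
    print(f"{k:2d} features -> mean 10-fold accuracy {acc:.3f}")
```

A drop in mean accuracy between k = 13 and k = 3 would quantify the cost of interpretability that the paper investigates; a small drop would support its thesis that compact, explainable models can remain reliable.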

Keywords

Interpretable Artificial Intelligence; Cardiovascular Disease Prediction; Machine Learning in Healthcare

Subject

Computer Science and Mathematics, Artificial Intelligence and Machine Learning
