Preprint Article · Version 1 · Preserved in Portico · This version is not peer-reviewed

DeClEx-Processing Pipeline for Critical Healthcare Application

Version 1 : Received: 14 May 2024 / Approved: 14 May 2024 / Online: 14 May 2024 (13:07:32 CEST)

How to cite: Shinde, G.; Goniguntla, S. C.; Shirur, P.; Hambaba, A. DeClEx-Processing Pipeline for Critical Healthcare Application. Preprints 2024, 2024050947. https://doi.org/10.20944/preprints202405.0947.v1

Abstract

Over the last decade, the prevalence of health issues has increased by approximately 29.1%, placing a substantial strain on healthcare services. This has accelerated the adoption of machine learning in healthcare, particularly following the COVID-19 pandemic. However, many current approaches are unsuitable for real-world deployment due to their high memory footprints and lack of interpretability. We introduce DeClEx, a pipeline designed to address these issues. DeClEx ensures that data mirrors real-world settings by incorporating Gaussian noise and blur, and employs autoencoders to learn intermediate feature representations. Our convolutional neural network, paired with spatial attention, then matches the accuracy of state-of-the-art pre-trained models while training roughly three times faster. Finally, we provide interpretable results using explainable AI techniques. DeClEx thus integrates denoising and deblurring, classification, and explainability in a single pipeline.
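The degradation step the abstract describes (corrupting clean images with Gaussian noise and blur so the data mirrors real-world acquisition conditions) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function name `degrade` and the parameter values for blur and noise strength are hypothetical, since the abstract does not specify them.

```python
import numpy as np

def gaussian_kernel(sigma, radius):
    """1-D Gaussian kernel, normalised to sum to 1."""
    x = np.arange(-radius, radius + 1)
    k = np.exp(-(x ** 2) / (2.0 * sigma ** 2))
    return k / k.sum()

def degrade(image, blur_sigma=1.5, noise_sigma=0.1, seed=0):
    """Blur a grayscale image in [0, 1] with a separable Gaussian
    filter, then add zero-mean Gaussian noise.

    blur_sigma and noise_sigma are illustrative values only.
    """
    k = gaussian_kernel(blur_sigma, radius=3)
    # Separable blur: convolve each row, then each column.
    blurred = np.apply_along_axis(
        lambda r: np.convolve(r, k, mode="same"), 1, image)
    blurred = np.apply_along_axis(
        lambda c: np.convolve(c, k, mode="same"), 0, blurred)
    rng = np.random.default_rng(seed)
    noisy = blurred + rng.normal(0.0, noise_sigma, size=image.shape)
    # Clip back into the valid intensity range.
    return np.clip(noisy, 0.0, 1.0)

# Example: degrade a synthetic 32x32 image with a bright square.
clean = np.zeros((32, 32))
clean[8:24, 8:24] = 1.0
degraded = degrade(clean)
```

A denoising autoencoder would then be trained to map `degraded` back to `clean`, learning the intermediate feature representations mentioned in the abstract.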

Keywords

machine learning; healthcare; classification; explainability

Subject

Computer Science and Mathematics, Artificial Intelligence and Machine Learning
