Preprint Article, Version 1 (not peer-reviewed; preserved in Portico)

A Highly Transparent and Explainable Artificial Intelligence Tool for Chronic Wound Classification: XAI-CWC

Version 1: Received: 17 January 2021 / Approved: 18 January 2021 / Online: 18 January 2021 (14:28:04 CET)

How to cite: Sarp, S.; Kuzlu, M.; Wilson, E.; Cali, U.; Guler, O. A Highly Transparent and Explainable Artificial Intelligence Tool for Chronic Wound Classification: XAI-CWC. Preprints 2021, 2021010346. https://doi.org/10.20944/preprints202101.0346.v1

Abstract

Artificial Intelligence (AI) has seen increased application and widespread adoption over the past decade despite, at times, offering only a limited understanding of its inner workings. AI algorithms are, in large part, built on weights, and these weights are calculated through large matrix multiplications. Such computationally intensive processes are typically harder to interpret. Explainable Artificial Intelligence (XAI) aims to address this black-box problem through the use of various techniques and tools. In this study, XAI techniques are applied to chronic wound classification. The proposed model classifies chronic wounds using transfer learning and fully connected layers. The classified chronic wound images then serve as input to the XAI model, which produces an explanation. Interpretable results can offer clinicians new perspectives during the diagnostic phase. The proposed method successfully provides both chronic wound classification and its associated explanation. This hybrid approach is shown to aid the interpretation and understanding of AI decision-making processes.
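The sketch below illustrates the kind of pipeline the abstract describes: a pre-trained convolutional backbone with a fully connected classification head (transfer learning), followed by a post-hoc explanation of a classified wound image. The choice of VGG16 as the backbone, the layer sizes, the four wound classes, and the use of LIME as the XAI technique are illustrative assumptions; none of these details are specified in this abstract.

```python
# Minimal sketch of a transfer-learning wound classifier with a post-hoc
# XAI explanation. The VGG16 backbone, layer sizes, class count, and the
# use of LIME are assumptions for illustration only.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models
from lime import lime_image

NUM_CLASSES = 4  # assumed number of chronic wound categories

# Frozen pre-trained backbone (transfer learning) plus a fully connected head.
base = tf.keras.applications.VGG16(include_top=False, weights="imagenet",
                                   input_shape=(224, 224, 3))
base.trainable = False

model = models.Sequential([
    base,
    layers.Flatten(),
    layers.Dense(256, activation="relu"),
    layers.Dropout(0.5),
    layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam", loss="categorical_crossentropy",
              metrics=["accuracy"])
# model.fit(train_images, train_labels, epochs=10)  # training data not shown

def predict_fn(images):
    """Return class probabilities for a batch of images, as LIME expects."""
    batch = tf.keras.applications.vgg16.preprocess_input(
        np.array(images, dtype=np.float32))
    return model.predict(batch)

# Explain a single classified wound image with LIME's image explainer:
explainer = lime_image.LimeImageExplainer()
# explanation = explainer.explain_instance(wound_image, predict_fn,
#                                          top_labels=1, num_samples=1000)
# highlighted, mask = explanation.get_image_and_mask(
#     explanation.top_labels[0], positive_only=True)
```

In this arrangement, the classifier and the explainer are separate components: the explainer only needs the prediction function, which is what makes the approach a hybrid of a conventional deep classifier and a model-agnostic XAI layer.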

Keywords

Chronic wound classification; transfer learning; explainable artificial intelligence.

Subject

Engineering, Control and Systems Engineering
