Abstract
Around the globe, respiratory lung diseases pose a severe threat to human health. Driven by the goal of reducing contagious transmission from infected to healthy persons, several technologies have evolved for diagnosing lung pathologies. One emerging technology is Artificial Intelligence (AI) based on computer vision for processing a wide variety of medical images, but AI methods without explainability are often treated as a black box. To demystify the rationale behind AI decisions, this paper presents the design and development of a novel low-cost explainable deep-learning diagnostic tool for predicting lung disease from medical images. To this end, we investigated explainable deep learning (DL) models (conventional DL and vision transformers (ViTs)) for predicting the presence of pneumonia, COVID-19, or no disease from both original and data augmentation (DA)-based medical images (from two chest X-ray datasets). The results show that the DA scheme combining cropping, rotation, and horizontal flipping (CROP+ROT+HF), applied to the input images before they are passed to an Inception-V3 architecture, surpassed all the ViTs and the other conventional DL approaches on most of the evaluated performance metrics. Overall, the results suggest that data augmentation helped the DL methods achieve higher classification accuracies. Furthermore, we compared five different class activation mapping (CAM) algorithms (GradCAM, GradCAM++, EigenGradCAM, AblationCAM, and RandomCAM). The results show that most of the examined CAM algorithms were effective in identifying the attention region indicating pneumonia or COVID-19 in the medical images (chest X-rays). Our low-cost AI diagnostic tool (pilot system) can assist medical experts and radiographers in providing early diagnosis of lung disease.
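The CROP+ROT+HF combination described above can be illustrated with a minimal NumPy sketch. The border size, rotation step, and flip choice below are illustrative assumptions, not the parameters used in the paper (which would typically use small random angles and random crops within a DL framework's transform pipeline):

```python
import numpy as np

def augment(image, crop=8, rot_steps=1, flip=True):
    """Apply a crop + rotation + horizontal-flip combination.

    image:     2-D array (H, W), e.g. a grayscale chest X-ray.
    crop:      pixels removed from each border (illustrative value).
    rot_steps: number of 90-degree rotation steps (a real pipeline
               would sample small random angles instead).
    flip:      whether to mirror the image horizontally.
    """
    h, w = image.shape[:2]
    out = image[crop:h - crop, crop:w - crop]  # CROP: trim the border
    out = np.rot90(out, k=rot_steps)           # ROT: rotate the crop
    if flip:
        out = np.fliplr(out)                   # HF: horizontal flip
    return out
```

In a real training loop these transforms would be applied randomly per sample so the model sees a different variant of each X-ray in every epoch.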
To this end, five to seven deep learning models and the explainability algorithms were deployed on a novel web interface implemented with the Gradio framework.
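Of the five CAM variants compared in the paper, Grad-CAM is the simplest to state: each feature map of the last convolutional layer is weighted by the global average of its gradient, the weighted sum is passed through a ReLU, and the result is normalised into a heatmap. A minimal framework-free NumPy sketch of that core computation (in practice the activations and gradients would come from the trained Inception-V3 via a library such as pytorch-grad-cam):

```python
import numpy as np

def grad_cam(activations, gradients):
    """Core Grad-CAM computation.

    activations: (K, H, W) feature maps from the last conv layer.
    gradients:   (K, H, W) gradients of the target class score
                 with respect to those feature maps.
    Returns an (H, W) heatmap normalised to [0, 1].
    """
    # alpha_k: global-average-pooled gradient per feature map.
    weights = gradients.mean(axis=(1, 2))
    # Weighted sum of the feature maps: sum_k alpha_k * A_k.
    cam = np.tensordot(weights, activations, axes=1)
    # ReLU keeps only features with a positive influence on the class.
    cam = np.maximum(cam, 0)
    if cam.max() > 0:
        cam /= cam.max()
    return cam
```

Upsampled to the input resolution and overlaid on the X-ray, such a heatmap is what lets a radiographer see which lung region drove the pneumonia or COVID-19 prediction.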
Original language | English |
---|---|
Article number | 108012 |
Journal | Computers in Biology and Medicine |
Volume | 170 |
DOIs | |
State | Published - Mar 2024 |
Bibliographical note
Publisher Copyright: © 2024 Elsevier Ltd
Keywords
- Deep learning
- Explainable artificial intelligence
- Medical imaging
- Software deployment
ASJC Scopus subject areas
- Health Informatics
- Computer Science Applications