Lightweight CNN for Resource-Constrained BCD System Using Knowledge Distillation

  • Falmata Modu*
  • Rajesh Prasad
  • Farouq Aliyu

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

2 Scopus citations

Abstract

Breast cancer (BC) remains a leading cause of mortality among women worldwide, with over two million new cases annually. Early detection improves survival rates significantly, yet resource limitations in low- and middle-income countries (LMICs) hinder access to advanced diagnostic tools. While deep learning (DL) models have shown high accuracy in breast cancer detection (BCD), their computational complexity and hardware requirements make them impractical for deployment on low-power devices. To address this, we propose a lightweight convolutional neural network (CNN) for BCD, leveraging knowledge distillation (KD) to transfer knowledge from a complex teacher model (TM) to a smaller student model (SM). Our approach achieves up to 99.3% accuracy, 100% precision, and 99% recall while reducing the number of trainable parameters by 87% compared to conventional deep models. The proposed model successfully runs on a Raspberry Pi 4B with an inference time of 500 ms and memory usage of just 12% (of 8GB RAM), demonstrating its suitability for telemedicine and mobile diagnostics. Additionally, resource utilization experiments confirm that inference remains stable at 10% CPU usage and 40°C when using a heat sink and fan, ensuring sustained deployment. Future work will explore federated learning for decentralized training, integration of multimodal data for enhanced diagnosis, and cloud-based model updates using delay-tolerant networks (DTNs) for remote healthcare applications.

Original language: English
Pages (from-to): 57504-57529
Number of pages: 26
Journal: IEEE Access
Volume: 13
DOIs
State: Published - 2025

Bibliographical note

Publisher Copyright:
© 2013 IEEE.

Keywords

  • Breast cancer
  • Keras
  • SDG3
  • convolutional neural network
  • deep learning
  • knowledge distillation
  • teacher-student model

ASJC Scopus subject areas

  • General Computer Science
  • General Materials Science
  • General Engineering
