Development of a Knowledge-Distillation-Based Breast Cancer Classifier for LMICs: Comparison with Pruning and Quantization

  • Falmata Modu*
  • Rajesh Prasad
  • Farouq Aliyu

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

Breast cancer (BC) mortality rates remain high in Low- and Middle-Income Countries (LMICs) due to limited awareness, poverty, and inadequate medical facilities that hinder early detection. Although deep learning models have achieved high accuracy in BC detection (BCD), they require substantial computational resources, making them unsuitable for deployment in remote or rural areas. This study proposes a lightweight convolutional neural network (CNN) using Knowledge Distillation (KD) for BCD, where a large Teacher Model (TM) transfers learned representations to a smaller Student Model (SM), which is better suited for deployment on low-power devices. We compare it with two prominent model compression techniques: pruning and quantization. Experimental results indicate that the TensorFlow Lite (TFLite)-optimized Student Model (SM_TFLite) achieved 97.67% accuracy, representing a 2.33% relative loss to its teacher, a result comparable to other compression techniques. Its mean accuracy is 73.97% with a 95% Confidence Interval of [65.04%, 82.90%] in a cross-dataset experiment. However, SM_TFLite was the most compact (5.21 kB) and fastest (3.3 ms latency), outperforming both pruned (2924.31 kB, 13.68 ms) and quantized models (746–751 kB, 4–5 ms). Evaluation on a Raspberry Pi 4 Model B demonstrated that all models exhibited similar CPU and memory usage, with SM_TFLite causing only a minor increase in device temperature. These results demonstrate that KD combined with TFLite conversion offers the best trade-off between accuracy, compactness, and speed.
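For readers unfamiliar with the technique, the teacher-to-student transfer described in the abstract follows the standard knowledge-distillation recipe: the student is trained on a weighted mix of the teacher's temperature-softened output distribution and the ground-truth labels. The snippet below is a minimal NumPy sketch of that loss; the temperature `T`, weight `alpha`, and function names are illustrative defaults, not the configuration used in the paper.

```python
import numpy as np

def softmax(logits, T=1.0):
    """Temperature-scaled softmax; higher T yields softer distributions."""
    z = np.asarray(logits, dtype=float) / T
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Weighted sum of a soft-target KL term (teacher vs. student,
    scaled by T^2 as in Hinton-style distillation) and the usual
    hard-label cross-entropy on the ground truth."""
    p_teacher = softmax(teacher_logits, T)   # teacher soft targets
    p_student = softmax(student_logits, T)   # student soft predictions
    # KL divergence between teacher and student distributions
    soft = np.sum(
        p_teacher * (np.log(p_teacher + 1e-12) - np.log(p_student + 1e-12)),
        axis=-1,
    ) * T ** 2
    # cross-entropy against the true class label
    idx = np.arange(len(labels))
    hard = -np.log(softmax(student_logits, 1.0)[idx, labels] + 1e-12)
    return float(np.mean(alpha * soft + (1 - alpha) * hard))

# Toy usage: a student that mimics the teacher incurs a lower loss
# than one that contradicts it.
teacher = np.array([[5.0, 0.0]])
good_student = np.array([[5.0, 0.0]])
bad_student = np.array([[0.0, 5.0]])
loss_good = distillation_loss(good_student, teacher, [0])
loss_bad = distillation_loss(bad_student, teacher, [0])
```

In a full training loop this loss would be minimized with respect to the student's parameters only, with the teacher frozen; the abstract's SM is then exported (here, via TFLite conversion) for on-device inference.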

Original language: English
Article number: 4842
Journal: Electronics (Switzerland)
Volume: 14
Issue number: 24
State: Published - Dec 2025

Bibliographical note

Publisher Copyright:
© 2025 by the authors.

UN SDGs

This output contributes to the following UN Sustainable Development Goals (SDGs)

  1. SDG 3 - Good Health and Well-being

Keywords

  • IoT
  • Keras
  • Knowledge Distillation
  • breast cancer
  • breast cancer detection
  • convolutional neural network
  • deep learning
  • healthcare
  • lightweight CNN
  • pruning
  • quantization
  • telemedicine

ASJC Scopus subject areas

  • Control and Systems Engineering
  • Signal Processing
  • Hardware and Architecture
  • Computer Networks and Communications
  • Electrical and Electronic Engineering
