Abstract
The growing use of Internet of Medical Things (IoMT) systems demands accurate sentiment analysis, nuanced emotion tracking, and seamless integration of multimodal data. Current models often struggle with heterogeneous data sources and evolving emotional patterns. To address these issues, this study proposes QED-Net, a quantum-inspired deep learning architecture designed specifically for IoMT environments. It introduces four modular components: the quantum-driven sentiment amplification (QSA) module enhances contextual sentiment interpretation; the temporal emotion evolution graph (TEEG) captures the shifting nature of emotional states over time; the hyperdimensional quantum tensor fusion (HD-QTF) module supports synchronized integration of diverse modalities; and the emotion-to-medical ontology encoder (EMOE) translates emotional cues into actionable clinical signals. These components operate both independently and in synergy, allowing flexible deployment in real-world scenarios. Simulations conducted on benchmark datasets confirm the model's effectiveness, with QED-Net achieving 93.2% precision in sentiment detection, 92.3% in emotion tracking, and 91.6% robustness in multimodal fusion.
| Original language | English |
|---|---|
| Journal | IEEE Transactions on Computational Social Systems |
| DOIs | |
| State | Accepted/In press - 2025 |
Bibliographical note
Publisher Copyright: © 2014 IEEE.
Keywords
- Emotion dynamics
- graph modeling
- medical Internet of Things
- quantum computing
- sentiment analysis
ASJC Scopus subject areas
- Modeling and Simulation
- Social Sciences (miscellaneous)
- Human-Computer Interaction