Design of a 3D emotion mapping model for visual feature analysis using improved Gaussian mixture models

Enshi Wang*, Fakhri Alam Khan

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

1 Scopus citation

Abstract

Multimodal recognition systems integrate color-emotion space information from multiple feature sources, and fusing this information effectively remains a significant challenge. This article proposes a three-dimensional (3D) color-emotion space visual feature extraction model for multimodal data integration, based on an improved Gaussian mixture model, to address this challenge. Unlike traditional methods, which often struggle with redundant information and high model complexity, the approach optimizes feature fusion using entropy and visual feature sequences. By integrating machine vision with six activation functions and multiple aesthetic features, the proposed method achieves a high emotion mapping accuracy (EMA) of 92.4%, an emotion recognition precision (ERP) of 88.35%, and an emotion recognition F1 score (ERFS) of 96.22%. These improvements over traditional approaches highlight the model's effectiveness in reducing complexity while enhancing emotion recognition accuracy, positioning it as an efficient solution for visual emotion analysis in multimedia applications.
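The abstract describes entropy-weighted fusion of multi-source visual features followed by Gaussian-mixture modeling. The sketch below is only an illustration of that general pipeline, not the paper's actual method: the feature arrays, the `entropy_weight` helper, and all parameter choices are hypothetical assumptions for demonstration.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Hypothetical visual features from two modalities (e.g., color and texture).
color_feats = rng.normal(0.0, 1.0, size=(200, 3))
texture_feats = rng.normal(2.0, 1.5, size=(200, 4))

def entropy_weight(features):
    # Shannon entropy of each feature column, used here as a fusion weight:
    # lower-entropy (more concentrated) columns receive higher weight.
    probs = np.abs(features) / np.abs(features).sum(axis=0, keepdims=True)
    ent = -(probs * np.log(probs + 1e-12)).sum(axis=0)
    return 1.0 - ent / ent.max()

# Weight each modality's features, then concatenate into one fused space.
fused = np.hstack([
    color_feats * entropy_weight(color_feats),
    texture_feats * entropy_weight(texture_feats),
])

# Fit a Gaussian mixture over the fused feature space; each component can
# be read as one cluster (emotion region) in the 3D color-emotion space.
gmm = GaussianMixture(n_components=3, covariance_type="full", random_state=0)
labels = gmm.fit_predict(fused)
```

In practice the number of components, covariance structure, and the weighting scheme would be tuned to the color-emotion data rather than fixed as above.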

Original language: English
Article number: e2596
Journal: PeerJ Computer Science
Volume: 10
DOIs
State: Published - 2024

Bibliographical note

Publisher Copyright:
© 2024, PeerJ Inc. All rights reserved.

Keywords

  • Emotional modeling
  • Three-dimensional color-emotion space
  • Visual aesthetic features

ASJC Scopus subject areas

  • General Computer Science
