Previous studies on the perception of emotions in embodied virtual agents have shown the effectiveness of virtual characters in conveying emotions through interactions with humans. However, creating an autonomous embodied conversational agent with expressive behaviors presents two major challenges. The first is the difficulty of synthesizing conversational behaviors for each modality that are as expressive as real human behaviors. The second is that affects are typically modeled independently for each modality, which makes it difficult to generate multimodal responses with consistent emotions across all modalities. In this work, we propose a conceptual framework, ACTOR (Affect-Consistent mulTimodal behaviOR generation), that aims to increase the perception of affects by generating multimodal behaviors conditioned on a single, consistent driving affect. We conducted a user study with 199 participants to assess how the average person judges the affects perceived from multimodal behaviors that are consistent and inconsistent with respect to a driving affect. The results show that among all model conditions, our affect-consistent framework receives the highest Likert scores for the perception of driving affects. Our statistical analysis suggests that making any single modality affect-inconsistent significantly decreases the perception of driving affects. We also observe that multimodal behaviors conditioned on consistent affects are more expressive than behaviors with inconsistent affects. We therefore conclude that multimodal emotion conditioning and affect consistency are vital to enhancing the perception of affects for embodied conversational agents.
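The core conditioning idea described in the abstract, driving every modality from one shared affect label versus letting each modality receive its own affect, can be sketched as below. This is a minimal illustrative sketch, not ACTOR's actual models: the modality list, `generate_behavior` placeholder, and all function names are hypothetical.

```python
# Hypothetical sketch of affect-consistent vs. affect-inconsistent
# multimodal behavior generation. The generator below is a stand-in
# for a modality-specific model conditioned on an affect label.

MODALITIES = ("speech", "facial_expression", "gesture")

def generate_behavior(modality: str, text: str, affect: str) -> dict:
    """Placeholder for a per-modality generative model conditioned on affect."""
    return {"modality": modality, "affect": affect, "content": f"{text} [{affect}]"}

def affect_consistent_response(text: str, driving_affect: str) -> list:
    """All modalities are conditioned on the SAME driving affect."""
    return [generate_behavior(m, text, driving_affect) for m in MODALITIES]

def affect_inconsistent_response(text: str, affects: dict) -> list:
    """Each modality may receive a different affect (the inconsistent
    condition, which the study found lowers perceived driving affect)."""
    return [generate_behavior(m, text, affects[m]) for m in MODALITIES]

response = affect_consistent_response("Nice to meet you!", "joy")
assert all(behavior["affect"] == "joy" for behavior in response)
```

In the user study, conditions of the second kind (one or more modalities swapped to a conflicting affect) received lower Likert scores for the perceived driving affect than the fully consistent condition.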
Title of host publication: IUI 2023 - Proceedings of the 28th International Conference on Intelligent User Interfaces
Publisher: Association for Computing Machinery
Number of pages: 12
State: Published - 27 Mar 2023
Event: 28th International Conference on Intelligent User Interfaces, IUI 2023 - Sydney, Australia
Duration: 27 Mar 2023 → 31 Mar 2023
Name: International Conference on Intelligent User Interfaces, Proceedings IUI
Conference: 28th International Conference on Intelligent User Interfaces, IUI 2023
Period: 27/03/23 → 31/03/23
Bibliographical note
Funding Information:
The research was supported in part by NSF awards: IIS-1703883, IIS-1955404, IIS-1955365, RETTL-2119265, and EAGER-2122119. This material is based upon work supported by the U.S. Department of Homeland Security under Grant Award Number 22STESE00001 01 01. This publication is based upon work supported by King Fahd University of Petroleum & Minerals. Author(s) at KFUPM acknowledge the Interdisciplinary Research Center for Intelligent Secure Systems for the support received under Grant Number INSS2305.
© 2023 ACM.
Keywords
- affect consistency
- embodied conversational agents
- emotion conditioning
- multimodal behavior generation
ASJC Scopus subject areas
- Human-Computer Interaction