Multi-HM: A Chinese Multimodal Dataset and Fusion Framework for Emotion Recognition in Human–Machine Dialogue Systems
Sentiment analysis is pivotal to advancing human–computer interaction (HCI) systems because it enables emotionally intelligent responses. While existing models show potential for HCI applications, current conversational datasets exhibit critical limitations for real-world deployment, particularly in captu...
| Main Authors: | Yao Fu, Qiong Liu, Qing Song, Pengzhou Zhang, Gongdong Liao |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | MDPI AG, 2025-04-01 |
| Series: | Applied Sciences |
| Online Access: | https://www.mdpi.com/2076-3417/15/8/4509 |
Similar Items
- Shuffling Augmented Decoupled Features for Multimodal Emotion Recognition
  by: Sunyoung Cho
  Published: (2025-01-01)
- DialogueMLLM: Transforming Multimodal Emotion Recognition in Conversation Through Instruction-Tuned MLLM
  by: Yuanyuan Sun, et al.
  Published: (2025-01-01)
- Af-CAN: Multimodal Emotion Recognition Method Based on Situational Attention Mechanism
  by: Xue Zhang, et al.
  Published: (2025-01-01)
- Intelligent emotion recognition for drivers using model-level multimodal fusion
  by: Xing Luan, et al.
  Published: (2025-07-01)
- Causal Inference for Modality Debiasing in Multimodal Emotion Recognition
  by: Juyeon Kim, et al.
  Published: (2024-12-01)