A novel facial expression recognition framework using deep learning based dynamic cross-domain dual attention network
| Main Authors: | , , , , , , |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | PeerJ Inc., 2025-05-01 |
| Series: | PeerJ Computer Science |
| Subjects: | |
| Online Access: | https://peerj.com/articles/cs-2866.pdf |
| Summary: | Variations in domain targets have recently posed significant challenges for facial expression recognition tasks, primarily due to domain shifts. Current methods focus largely on global feature adaptation to achieve domain-invariant learning; however, transferring local features across diverse domains remains an ongoing challenge. Additionally, during training on target datasets, these methods often suffer from reduced feature representation in the target domain due to insufficient discriminative supervision. To tackle these challenges, we propose a dynamic cross-domain dual attention network for facial expression recognition. Our model is specifically designed to learn domain-invariant features through separate modules for global and local adversarial learning. We also introduce a semantic-aware module to generate pseudo-labels, which computes semantic labels from both global and local features. We assess our model’s effectiveness through extensive experiments on the Real-world Affective Faces Database (RAF-DB), FER-PLUS, AffectNet, Expression in the Wild (ExpW), SFEW 2.0, and Japanese Female Facial Expression (JAFFE) datasets. The results demonstrate that our scheme outperforms the existing state-of-the-art methods by attaining recognition accuracies of 93.18%, 92.35%, 82.13%, 78.37%, 72.47%, and 70.68%, respectively. (A hedged code sketch of the adversarial and pseudo-label ideas follows this record.) |
| ISSN: | 2376-5992 |
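
The abstract describes two mechanisms at a high level: separate global and local adversarial learning modules for domain invariance, and a semantic-aware module that fuses global and local predictions into pseudo-labels. The paper's exact architecture is not given in this record, so the sketch below is only an illustration under assumptions: it uses a DANN-style gradient reversal layer for the adversarial branches, and a confidence-thresholded average of the two branches' softmax outputs for pseudo-labeling. All module names, layer sizes, and the threshold value are hypothetical.

```python
# Hedged sketch only: the record does not specify the paper's architecture.
# Assumptions: DANN-style gradient reversal for the global/local adversarial
# modules; confidence-thresholded fusion of both branches' predictions for
# the semantic-aware pseudo-labels.
import torch
import torch.nn as nn
import torch.nn.functional as F


class GradReverse(torch.autograd.Function):
    """Identity in the forward pass; scales gradients by -lambda backward."""

    @staticmethod
    def forward(ctx, x, lambd):
        ctx.lambd = lambd
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        return -ctx.lambd * grad_output, None


def grad_reverse(x, lambd=1.0):
    return GradReverse.apply(x, lambd)


class DualBranchDA(nn.Module):
    """Hypothetical global/local branches, each with its own discriminator."""

    def __init__(self, backbone_dim=512, feat_dim=256, num_classes=7):
        super().__init__()
        self.global_head = nn.Linear(backbone_dim, feat_dim)
        # Stand-in for attention-pooled facial-region features in the paper.
        self.local_head = nn.Linear(backbone_dim, feat_dim)
        self.cls_global = nn.Linear(feat_dim, num_classes)
        self.cls_local = nn.Linear(feat_dim, num_classes)
        # One binary source-vs-target discriminator per branch.
        self.disc_global = nn.Sequential(
            nn.Linear(feat_dim, 128), nn.ReLU(), nn.Linear(128, 2))
        self.disc_local = nn.Sequential(
            nn.Linear(feat_dim, 128), nn.ReLU(), nn.Linear(128, 2))

    def forward(self, feats, lambd=1.0):
        g = self.global_head(feats)
        l = self.local_head(feats)
        # Gradient reversal pushes the encoder to fool both discriminators,
        # encouraging domain invariance in global AND local features.
        dom_g = self.disc_global(grad_reverse(g, lambd))
        dom_l = self.disc_local(grad_reverse(l, lambd))
        return self.cls_global(g), self.cls_local(l), dom_g, dom_l


@torch.no_grad()
def semantic_pseudo_labels(global_logits, local_logits, threshold=0.9):
    """Average global/local softmax outputs; keep only confident samples."""
    probs = (F.softmax(global_logits, dim=1) +
             F.softmax(local_logits, dim=1)) / 2
    conf, labels = probs.max(dim=1)
    return labels, conf >= threshold  # mask of usable target samples
```

In a training loop following this sketch, the confident pseudo-labels would feed an ordinary cross-entropy term on target data alongside the two adversarial losses; the threshold trades pseudo-label coverage against label noise.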