Staged transfer learning for multi-label half-face emotion recognition


Bibliographic Details
Main Authors: Mohamed M. Abd ElMaksoud, Sherif H. ElGohary, Ahmed H. Kandil
Format: Article
Language:English
Published: SpringerOpen 2025-05-01
Series:Journal of Engineering and Applied Science
Subjects:
Online Access:https://doi.org/10.1186/s44147-025-00615-x
author Mohamed M. Abd ElMaksoud
Sherif H. ElGohary
Ahmed H. Kandil
author_facet Mohamed M. Abd ElMaksoud
Sherif H. ElGohary
Ahmed H. Kandil
author_sort Mohamed M. Abd ElMaksoud
collection DOAJ
description Abstract As fundamental drivers of human behavior, emotions can be expressed through various modalities, including facial expressions. Facial emotion recognition (FER) has emerged as a pivotal area of affective computing, enabling accurate detection of human emotions from visual cues. To enhance efficiency while maintaining accuracy, we propose a novel approach that leverages deep learning and transfer learning techniques to classify emotions based on only half of the human face. We introduce EMOFACE, a comprehensive half-facial imagery dataset annotated with 25 distinct emotion labels, providing a diverse and inclusive resource for multi-label half-facial emotion classification. By combining this dataset with the established FER2013 dataset, we employ a staged transfer learning framework that effectively addresses the challenges of multi-label half-facial emotion classification. Our proposed approach, which utilizes a custom convolutional neural network (ConvNet) and five pre-trained deep learning models (VGG16, VGG19, DenseNet, MobileNet, and ResNet), achieves impressive results. We report an average binary accuracy of 0.9244 for training, 0.9152 for validation, and 0.9138 for testing, demonstrating the efficacy of our method. The potential applications of this research extend to various domains, including affective computing, healthcare, robotics, human–computer interaction, and self-driving cars. By advancing the field of half-facial multi-label emotion recognition, our work contributes to the development of more intuitive and empathetic human–machine interactions.
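The abstract reports "average binary accuracy" for a 25-label multi-label classifier. In the usual multi-label setup, this metric is the fraction of per-label yes/no decisions that match the ground truth, with sigmoid outputs thresholded at 0.5. A minimal NumPy sketch of that metric (the 0.5 threshold and the toy arrays below are illustrative assumptions, not values taken from the paper):

```python
import numpy as np

def binary_accuracy(y_true, y_prob, threshold=0.5):
    """Fraction of correct per-label decisions, averaged over
    all samples and all labels (multi-label binary accuracy)."""
    y_pred = (y_prob >= threshold).astype(int)
    return (y_pred == y_true).mean()

# Toy example: 2 samples, 4 emotion labels each.
y_true = np.array([[1, 0, 0, 1],
                   [0, 1, 0, 0]])
y_prob = np.array([[0.9, 0.2, 0.4, 0.7],
                   [0.3, 0.6, 0.8, 0.1]])
print(binary_accuracy(y_true, y_prob))  # 7 of 8 label decisions correct -> 0.875
```

Note that with 25 labels this metric can look high even when whole-sample predictions are imperfect, since each sample contributes 25 independent binary decisions.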
format Article
id doaj-art-3fccd1a64c0c485a966a41a239c6ddc8
institution OA Journals
issn 1110-1903
2536-9512
language English
publishDate 2025-05-01
publisher SpringerOpen
record_format Article
series Journal of Engineering and Applied Science
spelling doaj-art-3fccd1a64c0c485a966a41a239c6ddc8 2025-08-20T01:51:38Z
Staged transfer learning for multi-label half-face emotion recognition
Mohamed M. Abd ElMaksoud, Sherif H. ElGohary, Ahmed H. Kandil (Systems and Biomedical Engineering Department, Faculty of Engineering, Cairo University)
SpringerOpen, Journal of Engineering and Applied Science, ISSN 1110-1903 / 2536-9512, 2025-05-01, https://doi.org/10.1186/s44147-025-00615-x
Subjects: Deep learning; Machine learning; Image processing; Transfer learning; Computer vision; Facial emotion recognition
(Abstract as in the description field above.)
spellingShingle Mohamed M. Abd ElMaksoud
Sherif H. ElGohary
Ahmed H. Kandil
Staged transfer learning for multi-label half-face emotion recognition
Journal of Engineering and Applied Science
Deep learning
Machine learning
Image processing
Transfer learning
Computer vision
Facial emotion recognition
title Staged transfer learning for multi-label half-face emotion recognition
title_full Staged transfer learning for multi-label half-face emotion recognition
title_fullStr Staged transfer learning for multi-label half-face emotion recognition
title_full_unstemmed Staged transfer learning for multi-label half-face emotion recognition
title_short Staged transfer learning for multi-label half-face emotion recognition
title_sort staged transfer learning for multi label half face emotion recognition
topic Deep learning
Machine learning
Image processing
Transfer learning
Computer vision
Facial emotion recognition
url https://doi.org/10.1186/s44147-025-00615-x
work_keys_str_mv AT mohamedmabdelmaksoud stagedtransferlearningformultilabelhalffaceemotionrecognition
AT sherifhelgohary stagedtransferlearningformultilabelhalffaceemotionrecognition
AT ahmedhkandil stagedtransferlearningformultilabelhalffaceemotionrecognition