Automated emotion recognition of students in virtual reality classrooms


Bibliographic Details
Main Authors: Michael Shomoye, Richard Zhao
Format: Article
Language: English
Published: Elsevier 2024-12-01
Series: Computers & Education: X Reality
Subjects:
Online Access: http://www.sciencedirect.com/science/article/pii/S2949678024000321
_version_ 1850249661437181952
author Michael Shomoye
Richard Zhao
author_facet Michael Shomoye
Richard Zhao
author_sort Michael Shomoye
collection DOAJ
description In contemporary educational settings, understanding and assessing student engagement through non-verbal cues, especially facial expressions, is pivotal. Such cues have long informed educators about students' cognitive and emotional states, assisting them in tailoring their teaching methods. However, the rise of online learning platforms and advanced technologies such as virtual reality (VR) challenge the conventional modes of gauging student engagement, especially when certain facial features become obscured or are entirely absent. This research explores the potential of Convolutional Neural Networks (CNNs), specifically a custom-trained model adapted from the ResNet50 architecture, in recognizing and distinguishing subtle facial expressions in real-time, such as neutrality, boredom, happiness, and confusion. The novelty of our approach is twofold: First, we optimize the power of CNNs to analyze facial expressions in digital learning platforms. Second, we innovate for the context of VR by focusing on the lower half of the face to tackle occlusion challenges posed by wearing VR headsets. Through comprehensive experimentation, we compare our model's performance with the default ResNet50 model and evaluate it against full-face and VR-occluded face datasets. Ultimately, our endeavor aims to provide educators with a sophisticated tool for real-time evaluation of student engagement in technologically advanced learning environments, subsequently enriching the teaching and learning experience.
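The occlusion-handling idea in the abstract, discarding the headset-covered upper face and classifying only the visible lower half, can be sketched as below. The 50% crop fraction, the class list, and the function name are illustrative assumptions, not details taken from the paper; in the actual pipeline the cropped region would be fed to the custom ResNet50-based model.

```python
# Sketch of the lower-face preprocessing step described in the abstract.
# The four class labels and the exact 50% crop are assumptions.

EMOTIONS = ["neutral", "bored", "happy", "confused"]

def crop_lower_half(image):
    """Keep only the bottom half of a face image (list of pixel rows),
    approximating the region left visible below a VR headset."""
    rows = len(image)
    return image[rows // 2:]

# Toy 4x4 "image": each pixel stores its row index so the crop is visible.
face = [[r] * 4 for r in range(4)]
lower = crop_lower_half(face)
print(len(lower))    # 2 rows remain
print(lower[0][0])   # 2, i.e. the first surviving row came from the lower half
```

In a real system this crop would run after face detection and before resizing to the network's input resolution, so the classifier never sees the occluded region.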
format Article
id doaj-art-941fb5c1d1a84a6fbb41625c09e78b20
institution OA Journals
issn 2949-6780
language English
publishDate 2024-12-01
publisher Elsevier
record_format Article
series Computers & Education: X Reality
spelling doaj-art-941fb5c1d1a84a6fbb41625c09e78b20 (indexed 2025-08-20T01:58:27Z)
Language: eng; Publisher: Elsevier
Series: Computers & Education: X Reality, ISSN 2949-6780
Published: 2024-12-01, vol. 5, article 100082, DOI 10.1016/j.cexr.2024.100082
Title: Automated emotion recognition of students in virtual reality classrooms
Authors: Michael Shomoye; Richard Zhao (corresponding author); both: Department of Computer Science, University of Calgary, Calgary, Alberta, T2N 1N4, Canada
Abstract: as in the description field above
URL: http://www.sciencedirect.com/science/article/pii/S2949678024000321
Keywords: Emotion recognition; Facial expression; Virtual reality; Virtual learning environment
spellingShingle Michael Shomoye
Richard Zhao
Automated emotion recognition of students in virtual reality classrooms
Computers & Education: X Reality
Emotion recognition
Facial expression
Virtual reality
Virtual learning environment
title Automated emotion recognition of students in virtual reality classrooms
title_full Automated emotion recognition of students in virtual reality classrooms
title_fullStr Automated emotion recognition of students in virtual reality classrooms
title_full_unstemmed Automated emotion recognition of students in virtual reality classrooms
title_short Automated emotion recognition of students in virtual reality classrooms
title_sort automated emotion recognition of students in virtual reality classrooms
topic Emotion recognition
Facial expression
Virtual reality
Virtual learning environment
url http://www.sciencedirect.com/science/article/pii/S2949678024000321
work_keys_str_mv AT michaelshomoye automatedemotionrecognitionofstudentsinvirtualrealityclassrooms
AT richardzhao automatedemotionrecognitionofstudentsinvirtualrealityclassrooms