Impact of virtual agent facial emotions and attention on N170 ERP amplitude: comparative study

Bibliographic Details
Main Authors: Luisa Kirasirova, Olga Maslova, Vasiliy Pyatin
Format: Article
Language: English
Published: Frontiers Media S.A. 2025-02-01
Series: Frontiers in Behavioral Neuroscience
Subjects:
Online Access:https://www.frontiersin.org/articles/10.3389/fnbeh.2025.1523705/full
Description
Summary: Introduction: It is known from the literature that face perception of virtual agents affects the amplitude and latency of ERP components. However, the sensitivity of the N170 component to virtual agent facial emotions, and to the level of attention directed at facial emotional expressions, had not previously been investigated in a virtual reality environment, which was the aim of our study. Methods: EEG recording and 2D and 3D (VR) visual testing of neutral, happy, and disgusted facial emotions of virtual agents were used. The protocol consisted of three sessions differing in the participants' attentional condition toward each facial emotion (passive, active, and active toward the neutral facial expression). N170 ERP amplitudes were also compared between the 2D and VR environments. Results: In the context of virtual agent facial emotional expressions, we identified the following dynamics of the N170 amplitude: attention (passive/active) alone showed no significant effect; active attention to neutral virtual agent facial emotions reduced the N170 amplitude; significant interactions were observed between the factors "emotion × attention" and "environment × attention," but no interaction was found among all three factors. Conclusion: The immersive quality of the environment in which visual and emotional events are presented has a less pronounced effect on early-stage face processing, as indexed by N170 amplitude. Thus, our findings indicate that the N170 amplitude is primarily modulated by the emotional content of, and the attention directed to, virtual agent facial emotional expressions.
ISSN:1662-5153