Contextual Emotional Transformer-Based Model for Comment Analysis in Mental Health Case Prediction

Bibliographic Details
Main Authors: Ayodeji O. J. Ibitoye, Oladosu O. Oladimeji, Olufade F. W. Onifade
Format: Article
Language:English
Published: World Scientific Publishing 2025-08-01
Series:Vietnam Journal of Computer Science
Subjects:
Online Access:https://www.worldscientific.com/doi/10.1142/S2196888824500192
Description
Summary:Mental health (MH) assessment and prediction have become critical areas of focus in healthcare, leveraging developments in natural language processing (NLP). Recent advancements in machine learning have enabled predictive models for MH based on user-generated comments, but these models have largely overlooked the integration of emotional attention mechanisms. They often struggle with contextual nuances and emotional subtleties, leading to suboptimal predictions. The prevailing challenge lies in accurately understanding the emotional context embedded within textual comments, which is crucial for effective prediction and intervention. In this research, we introduce a novel approach employing a contextual emotional transformer-based model (CETM) for comment analysis in MH case prediction. CETM leverages state-of-the-art transformer architectures enhanced with contextual embedding layers and emotional attention mechanisms. By incorporating contextual information and emotional cues, CETM captures the underlying emotional states and MH indicators expressed in user comments. Through extensive experimentation and evaluation, both RoBERTa and bidirectional encoder representations from transformers (BERT) models exhibited higher accuracy, precision, recall, and F1 scores than their counterparts lacking emotional attention. Notably, the RoBERTa model attained an accuracy of 94.5%, compared to BERT’s 87.6%, when emotional attention was employed. Hence, by incorporating emotional context into the predictive model, we achieved significant improvements, offering promising avenues for more precise and personalized MH interventions.
ISSN:2196-8888
2196-8896
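
Illustrative sketch: The record above gives no implementation details, so the following minimal PyTorch sketch only illustrates the general idea described in the summary, namely contextual token embeddings from a pretrained transformer pooled through a learned emotional attention layer before a classification head. The backbone name (roberta-base), the additive-attention pooling, and the two-label head are assumptions made for illustration; this is not the authors' published code.

# Minimal sketch (assumptions noted above), not the CETM release.
import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer

class EmotionalAttentionClassifier(nn.Module):
    def __init__(self, backbone_name="roberta-base", num_labels=2):
        super().__init__()
        # Pretrained transformer provides the contextual embeddings.
        self.backbone = AutoModel.from_pretrained(backbone_name)
        hidden = self.backbone.config.hidden_size
        # "Emotional attention" here is a standard additive-attention pooling
        # that scores each token's contribution to the comment representation.
        self.attn = nn.Sequential(
            nn.Linear(hidden, hidden),
            nn.Tanh(),
            nn.Linear(hidden, 1),
        )
        self.classifier = nn.Linear(hidden, num_labels)

    def forward(self, input_ids, attention_mask):
        # Contextual token states: (batch, seq_len, hidden)
        token_states = self.backbone(
            input_ids=input_ids, attention_mask=attention_mask
        ).last_hidden_state
        scores = self.attn(token_states).squeeze(-1)            # (batch, seq_len)
        scores = scores.masked_fill(attention_mask == 0, float("-inf"))
        weights = torch.softmax(scores, dim=-1).unsqueeze(-1)   # attention weights
        pooled = (weights * token_states).sum(dim=1)            # weighted pooling
        return self.classifier(pooled)                          # logits over classes

# Usage example on a single comment (text and classes are illustrative only).
tokenizer = AutoTokenizer.from_pretrained("roberta-base")
model = EmotionalAttentionClassifier()
batch = tokenizer(["I can't sleep and everything feels hopeless."],
                  return_tensors="pt", padding=True, truncation=True)
logits = model(batch["input_ids"], batch["attention_mask"])
print(logits.softmax(dim=-1))

Additive-attention pooling is chosen here because it lets the model upweight emotionally salient tokens when forming the comment-level vector; the paper's actual emotional attention mechanism may differ.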