Emotion-Aware RoBERTa enhanced with emotion-specific attention and TF-IDF gating for fine-grained emotion recognition

Abstract: Emotion recognition in text is a fundamental task in natural language processing, underpinning applications such as sentiment analysis, mental health monitoring, and content moderation. Although transformer-based models like RoBERTa have advanced contextual understanding in text, they still face limitations in identifying subtle emotional cues, handling class imbalances, and processing noisy or informal input. To address these challenges, this paper introduces Emotion-Aware RoBERTa, an enhanced framework that integrates an Emotion-Specific Attention (ESA) layer and a TF-IDF based gating mechanism. These additions are designed to dynamically prioritize emotionally salient tokens while suppressing irrelevant content, thereby improving both classification accuracy and robustness. The model achieved 96.77% accuracy and a weighted F1-score of 0.97 on the primary dataset, outperforming baseline RoBERTa and other benchmark models such as DistilBERT and ALBERT with a relative improvement ranging from 9.68% to 10.87%. Its generalization capability was confirmed across two external datasets, achieving 88.03% on a large-scale corpus and 65.67% on a smaller, noisier dataset. An ablation study revealed the complementary impact of the ESA and TF-IDF components, balancing performance and inference efficiency. Attention heatmaps were used to visualize ESA’s ability to focus on key emotional expressions, while inference-time optimizations using FP16 and Automatic Mixed Precision (AMP) reduced memory consumption and latency. Additionally, McNemar’s statistical test confirmed the significance of the improvements over the baseline. These findings demonstrate that Emotion-Aware RoBERTa offers a scalable, interpretable, and deployment-friendly solution for fine-grained emotion recognition, making it well-suited for real-world NLP applications in emotion-aware systems.

Bibliographic Details
Main Authors: Fatimah Alqarni, Alaa Sagheer, Amira Alabbad, Hala Hamdoun
Format: Article
Language: English
Published: Nature Portfolio, 2025-05-01
Series: Scientific Reports
Subjects: Natural language processing; Transformers; Deep learning; Emotion recognition; Attention mechanism; Text classification
Online Access: https://doi.org/10.1038/s41598-025-99515-6
_version_ 1850268779749048320
author Fatimah Alqarni
Alaa Sagheer
Amira Alabbad
Hala Hamdoun
author_facet Fatimah Alqarni
Alaa Sagheer
Amira Alabbad
Hala Hamdoun
author_sort Fatimah Alqarni
collection DOAJ
description Abstract Emotion recognition in text is a fundamental task in natural language processing, underpinning applications such as sentiment analysis, mental health monitoring, and content moderation. Although transformer-based models like RoBERTa have advanced contextual understanding in text, they still face limitations in identifying subtle emotional cues, handling class imbalances, and processing noisy or informal input. To address these challenges, this paper introduces Emotion-Aware RoBERTa, an enhanced framework that integrates an Emotion-Specific Attention (ESA) layer and a TF-IDF based gating mechanism. These additions are designed to dynamically prioritize emotionally salient tokens while suppressing irrelevant content, thereby improving both classification accuracy and robustness. The model achieved 96.77% accuracy and a weighted F1-score of 0.97 on the primary dataset, outperforming baseline RoBERTa and other benchmark models such as DistilBERT and ALBERT with a relative improvement ranging from 9.68% to 10.87%. Its generalization capability was confirmed across two external datasets, achieving 88.03% on a large-scale corpus and 65.67% on a smaller, noisier dataset. An ablation study revealed the complementary impact of the ESA and TF-IDF components, balancing performance and inference efficiency. Attention heatmaps were used to visualize ESA’s ability to focus on key emotional expressions, while inference-time optimizations using FP16 and Automatic Mixed Precision (AMP) reduced memory consumption and latency. Additionally, McNemar’s statistical test confirmed the significance of the improvements over the baseline. These findings demonstrate that Emotion-Aware RoBERTa offers a scalable, interpretable, and deployment-friendly solution for fine-grained emotion recognition, making it well-suited for real-world NLP applications in emotion-aware systems.
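To make the described architecture more concrete, the sketch below shows one way the two named additions, an Emotion-Specific Attention (ESA) layer and TF-IDF based gating over RoBERTa token states, could be wired together in PyTorch with the Hugging Face transformers library. It is a minimal illustration under assumptions not stated in this record: a roberta-base backbone, one learned query vector per emotion class for ESA, a sigmoid gate over precomputed per-token TF-IDF scores, and a six-label emotion set. The class name, layer shapes, and exact gating formula are hypothetical, not the authors' published implementation.

```python
# Minimal sketch only: hypothetical layer names, sizes, and gating formula.
import torch
import torch.nn as nn
from transformers import RobertaModel, RobertaTokenizerFast


class EmotionAwareRobertaSketch(nn.Module):
    """Illustrative combination of TF-IDF gating and Emotion-Specific Attention."""

    def __init__(self, num_labels: int, hidden_size: int = 768):
        super().__init__()
        self.backbone = RobertaModel.from_pretrained("roberta-base")
        # ESA: one learned query vector per emotion class; each query scores every
        # token and pools the sequence into a per-class summary.
        self.esa_queries = nn.Parameter(torch.randn(num_labels, hidden_size))
        self.classifier = nn.Linear(hidden_size, num_labels)

    def forward(self, input_ids, attention_mask, tfidf_weights):
        # tfidf_weights: (batch, seq_len) precomputed TF-IDF score per subword
        # token, used as a multiplicative gate so informative tokens dominate.
        hidden = self.backbone(input_ids, attention_mask=attention_mask).last_hidden_state
        gated = hidden * torch.sigmoid(tfidf_weights).unsqueeze(-1)    # TF-IDF gating

        scores = torch.einsum("bsh,eh->bse", gated, self.esa_queries)  # ESA scores
        scores = scores.masked_fill(attention_mask.unsqueeze(-1) == 0, float("-inf"))
        attn = torch.softmax(scores, dim=1)                            # over tokens
        pooled = torch.einsum("bse,bsh->beh", attn, gated)             # per-emotion pooling
        return self.classifier(pooled.mean(dim=1))                     # class logits


if __name__ == "__main__":
    tok = RobertaTokenizerFast.from_pretrained("roberta-base")
    enc = tok(["I can't stop smiling today!"], return_tensors="pt")
    # Placeholder TF-IDF scores just to exercise the forward pass; a real pipeline
    # would compute them from the corpus and align them to the subword tokens.
    tfidf = torch.rand(enc["input_ids"].shape)
    model = EmotionAwareRobertaSketch(num_labels=6).eval()
    with torch.inference_mode():
        print(model(enc["input_ids"], enc["attention_mask"], tfidf).shape)  # (1, 6)
```

The FP16/AMP inference optimization mentioned in the abstract would correspond to wrapping such a forward pass in torch.autocast with dtype=torch.float16 on a CUDA device; that is standard PyTorch practice rather than a detail taken from this record.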
format Article
id doaj-art-aaf2804cdbf541b9a5f2e0dc111a121c
institution OA Journals
issn 2045-2322
language English
publishDate 2025-05-01
publisher Nature Portfolio
record_format Article
series Scientific Reports
spelling doaj-art-aaf2804cdbf541b9a5f2e0dc111a121c | 2025-08-20T01:53:22Z | eng | Nature Portfolio | Scientific Reports | 2045-2322 | 2025-05-01 | 15 | 1 | 1 | 19 | 10.1038/s41598-025-99515-6
Emotion-Aware RoBERTa enhanced with emotion-specific attention and TF-IDF gating for fine-grained emotion recognition
Fatimah Alqarni; Alaa Sagheer; Amira Alabbad; Hala Hamdoun (all: Department of Computer Science, College of Computer Sciences and Information Technology, King Faisal University)
Abstract and subject terms as given in the description and topic fields.
https://doi.org/10.1038/s41598-025-99515-6
spellingShingle Fatimah Alqarni
Alaa Sagheer
Amira Alabbad
Hala Hamdoun
Emotion-Aware RoBERTa enhanced with emotion-specific attention and TF-IDF gating for fine-grained emotion recognition
Scientific Reports
Natural language processing
Transformers
Deep learning
Emotion recognition
Attention mechanism
Text classification
title Emotion-Aware RoBERTa enhanced with emotion-specific attention and TF-IDF gating for fine-grained emotion recognition
title_full Emotion-Aware RoBERTa enhanced with emotion-specific attention and TF-IDF gating for fine-grained emotion recognition
title_fullStr Emotion-Aware RoBERTa enhanced with emotion-specific attention and TF-IDF gating for fine-grained emotion recognition
title_full_unstemmed Emotion-Aware RoBERTa enhanced with emotion-specific attention and TF-IDF gating for fine-grained emotion recognition
title_short Emotion-Aware RoBERTa enhanced with emotion-specific attention and TF-IDF gating for fine-grained emotion recognition
title_sort emotion aware roberta enhanced with emotion specific attention and tf idf gating for fine grained emotion recognition
topic Natural language processing
Transformers
Deep learning
Emotion recognition
Attention mechanism
Text classification
url https://doi.org/10.1038/s41598-025-99515-6
work_keys_str_mv AT fatimahalqarni emotionawarerobertaenhancedwithemotionspecificattentionandtfidfgatingforfinegrainedemotionrecognition
AT alaasagheer emotionawarerobertaenhancedwithemotionspecificattentionandtfidfgatingforfinegrainedemotionrecognition
AT amiraalabbad emotionawarerobertaenhancedwithemotionspecificattentionandtfidfgatingforfinegrainedemotionrecognition
AT halahamdoun emotionawarerobertaenhancedwithemotionspecificattentionandtfidfgatingforfinegrainedemotionrecognition