Optimised knowledge distillation for efficient social media emotion recognition using DistilBERT and ALBERT
Abstract: Accurate emotion recognition in social media text is critical for applications such as sentiment analysis, mental health monitoring, and human-computer interaction. However, existing approaches face challenges like computational complexity and class imbalance, limiting their deployment in r...
| Main Authors: | Muhammad Hussain, Caikou Chen, Muzammil Hussain, Muhammad Anwar, Mohammed Abaker, Abdelzahir Abdelmaboud, Iqra Yamin |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | Nature Portfolio, 2025-08-01 |
| Series: | Scientific Reports |
| Subjects: | |
| Online Access: | https://doi.org/10.1038/s41598-025-16001-9 |
Similar Items
- Evaluating sentiment analysis models: A comparative analysis of vaccination tweets during the COVID-19 phase leveraging DistilBERT for enhanced insights, by: Renuka Agrawal, et al. Published: (2025-06-01)
- Enhancing consistency in piping and instrumentation diagrams using DistilBERT and smart PID systems, by: F.S. Gómez-Vega, et al. Published: (2025-12-01)
- Autocorrelation Matrix Knowledge Distillation: A Task-Specific Distillation Method for BERT Models, by: Kai Zhang, et al. Published: (2024-10-01)
- Emotion on the edge: An evaluation of feature representations and machine learning models, by: James Thomas Black, et al. Published: (2025-03-01)
- Leveraging BERT, DistilBERT, and TinyBERT for Rumor Detection, by: Aijazahamed Qazi, et al. Published: (2025-01-01)