Comparative Analysis of Transformers to Support Fine-Grained Emotion Detection in Short-Text Data
Understanding a person’s mood and circumstances by way of sentiment or finer-grained emotion detection can play a significant role in AI systems and applications, such as in chat dialogue or reviews. Analysis of emotion from text typically requires specialized text or document understanding, and recent work has focused on transformer learning approaches. Common models of these transformers (e.g. BERT, RoBERTa, ELECTRA, XLM-R, and XLNet) have been pre-trained using longer texts of well-written English; however, many application contexts align more directly with social media content or have a shorter format more akin to social media, where texts often bend or violate standard language conventions. To understand the applicability and tradeoffs among common transformers within such contexts, our research investigates accuracy and efficiency considerations in fine-tuning transformers for granular emotion detection in short-text data. This paper presents a comparative study investigating the performance of five common transformers as applied in the specific context of multi-category emotion detection in short-text Twitter data. The study explores different considerations for hyperparameter settings in this context. Results show significant fine-tuning benefits in comparison to recommended baselines for the approaches and provide guidance for fine-tuning to support fine-grained emotion detection in short texts.
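The record itself carries no code; the sketch below is only a minimal illustration of the kind of fine-tuning pipeline the abstract describes, written against the Hugging Face `transformers` and `datasets` libraries. The `roberta-base` checkpoint, the public `dair-ai/emotion` Twitter dataset (a stand-in corpus, not necessarily the one used in the paper), and all hyperparameter values are illustrative assumptions rather than details taken from the study.

```python
from datasets import load_dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

# Assumed checkpoint; any of the five architectures compared in the paper
# (BERT, RoBERTa, ELECTRA, XLM-R, XLNet) has an equivalent hub checkpoint.
MODEL_NAME = "roberta-base"

# Stand-in corpus: short Twitter messages labelled with six emotion categories.
dataset = load_dataset("dair-ai/emotion")
num_labels = dataset["train"].features["label"].num_classes

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)

def tokenize(batch):
    # Tweets are short, so a small max_length keeps truncation rare and padding cheap.
    return tokenizer(batch["text"], truncation=True, max_length=64)

dataset = dataset.map(tokenize, batched=True)

model = AutoModelForSequenceClassification.from_pretrained(
    MODEL_NAME, num_labels=num_labels
)

# Illustrative hyperparameters; the study's point is that tuning values like
# these matters for short-text emotion detection, not that these particular
# numbers are optimal.
args = TrainingArguments(
    output_dir="emotion-finetune",
    learning_rate=2e-5,
    per_device_train_batch_size=32,
    num_train_epochs=3,
    weight_decay=0.01,
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=dataset["train"],
    eval_dataset=dataset["validation"],
    tokenizer=tokenizer,  # enables dynamic padding via the default data collator
)

trainer.train()
print(trainer.evaluate())
```

Swapping `MODEL_NAME` for a BERT, ELECTRA, XLM-R, or XLNet checkpoint yields the kind of side-by-side comparison the abstract describes.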
| Main Authors: | Robert H. Frye, David C. Wilson |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | LibraryPress@UF, 2022-05-01 |
| Series: | Proceedings of the International Florida Artificial Intelligence Research Society Conference |
| Subjects: | transformers; hyperparameters; emotion detection; fine-grained emotion detection; fine-tuning |
| Online Access: | https://journals.flvc.org/FLAIRS/article/view/130612 |

| Author Affiliation: | University of North Carolina at Charlotte |
|---|---|
| Series: | Proceedings of the International Florida Artificial Intelligence Research Society Conference |
| ISSN: | 2334-0754, 2334-0762 |
| DOI: | 10.32473/flairs.v35i.130612 |
| Collection: | DOAJ |
| Record ID: | doaj-art-115ff24561904716afa3d54fa3d6de08 |