Advancing Computational Humor: LLaMa-3 Based Generation with DistilBert Evaluation Framework
Humor generation presents significant challenges in the field of natural language processing, primarily due to its reliance on cultural backgrounds and subjective interpretations. These factors contribute to the variability of human-generated humor, necessitating computational models capable of mast...
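The record describes a pipeline in which a LLaMa-3 model generates candidate jokes and a DistilBERT-based framework evaluates them. As a rough illustration only (the paper's actual checkpoints and prompts are not given in this record), a minimal sketch of such an evaluation step with the Hugging Face `transformers` library might look like the following; the classifier checkpoint below is a publicly available stand-in, not the humor model used by the authors.

```python
# Minimal sketch (not the paper's code): scoring generated text with a
# DistilBERT sequence classifier. The checkpoint is a placeholder; in the
# paper's setting it would be a DistilBERT model fine-tuned on humor data.
from transformers import pipeline

humor_judge = pipeline(
    "text-classification",
    model="distilbert-base-uncased-finetuned-sst-2-english",  # placeholder checkpoint
)

# Hard-coded examples; in the described framework these candidates would
# come from a LLaMa-3 generator rather than a fixed list.
candidates = [
    "I told my computer I needed a break, and now it keeps sending me KitKat ads.",
    "The weather today is mostly cloudy.",
]

for text in candidates:
    result = humor_judge(text)[0]
    print(f"{result['label']:>10}  {result['score']:.3f}  {text}")
```

The design point such a setup illustrates is separating generation from evaluation: the generator proposes many candidates, and a lightweight discriminative model filters or ranks them, which is cheaper than scoring with another large language model.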
| Main Authors: | He Jinliang, Mei Aohan |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | EDP Sciences, 2025-01-01 |
| Series: | ITM Web of Conferences |
| Online Access: | https://www.itm-conferences.org/articles/itmconf/pdf/2025/01/itmconf_dai2024_03024.pdf |
Similar Items
- T-LLaMA: a Tibetan large language model based on LLaMA2
  by: Hui Lv, et al.
  Published: (2024-12-01)
- Leveraging BERT, DistilBERT, and TinyBERT for Rumor Detection
  by: Aijazahamed Qazi, et al.
  Published: (2025-01-01)
- Identifying artificial intelligence-generated content using the DistilBERT transformer and NLP techniques
  by: Hikmat Ullah Khan, et al.
  Published: (2025-07-01)
- Optimised knowledge distillation for efficient social media emotion recognition using DistilBERT and ALBERT
  by: Muhammad Hussain, et al.
  Published: (2025-08-01)
- Risks and Regulations for Application of the LLaMA Model in University Future Learning Centers
  by: QIAO Jinhua, MA Xueyun
  Published: (2025-02-01)