Prompt Tuning Techniques for Chinese Idiom Recommendation

Bibliographic Details
Main Authors: Shun-Ming Wang, I-Fang Su, Yu-Chi Chung
Format: Article
Language: English
Published: IEEE 2025-01-01
Series: IEEE Access
Online Access: https://ieeexplore.ieee.org/document/10965689/
Description
Summary: Chinese idioms pose significant challenges in natural language processing (NLP) because of their complex, non-compositional nature and the cultural and historical meanings frequently embedded in them. This study investigates prompt tuning techniques for Chinese idiom recommendation, exploring multiple-choice (MC) prompts, binary classification (BC) prompts, and prompt ensembling strategies. We also introduce an innovative dynamic candidate sampling strategy (DCSS) designed to mitigate the overfitting commonly encountered when prompt tuning methods are applied to Chinese idiom datasets. Our experimental results demonstrate that prompt tuning combined with ensembling significantly improves model performance across multiple datasets. The proposed methods outperform state-of-the-art (SOTA) methods while maintaining training and inference efficiency. Moreover, we show that prompt tuning can be effectively generalized to other NLP tasks, such as sentiment analysis.
ISSN: 2169-3536
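
This record gives no implementation details, so the following Python sketch is purely illustrative: the prompt wording, the toy idiom pool, and the names sample_candidates and build_mc_prompt are assumptions, not the authors' code. It shows one plausible way to build a multiple-choice (MC) prompt for idiom recommendation and to resample distractor candidates per example, loosely in the spirit of the dynamic candidate sampling strategy (DCSS) mentioned in the summary:

import random

# Toy idiom pool -- illustrative only; the paper's datasets are not
# described in this record.
IDIOM_POOL = ["画蛇添足", "守株待兔", "井底之蛙", "亡羊补牢", "对牛弹琴"]

def sample_candidates(gold_idiom, pool, k=4, rng=random):
    """Draw k-1 fresh distractors for a training example. Re-sampling
    distractors on every pass is a rough analogue of the DCSS idea of
    varying the candidate set to reduce overfitting; the actual strategy
    is not specified in this record."""
    distractors = rng.sample([i for i in pool if i != gold_idiom], k - 1)
    options = distractors + [gold_idiom]
    rng.shuffle(options)
    return options

def build_mc_prompt(context, options):
    """Format a multiple-choice (MC) prompt: a sentence with a blank,
    followed by lettered idiom candidates."""
    letters = "ABCD"
    lines = [f"句子：{context}",  # "Sentence: <context>"
             "请选出最适合填入空白处的成语："]  # "Pick the idiom that best fills the blank:"
    lines += [f"{letters[i]}. {opt}" for i, opt in enumerate(options)]
    lines.append("答案：")  # "Answer:"
    return "\n".join(lines)

if __name__ == "__main__":
    context = "任务早已完成，他却又多此一举，真是____。"
    print(build_mc_prompt(context, sample_candidates("画蛇添足", IDIOM_POOL)))

By contrast, a binary classification (BC) prompt would present a single candidate idiom in context and ask for a yes/no judgment, and prompt ensembling, as named in the summary, would aggregate predictions across several such prompt templates.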