Construction of Prompt Verbalizer Based on Dynamic Search Tree for Text Classification

Bibliographic Details
Main Authors: Jinfeng Gao, Xianliang Xia, Ruxian Yao, Junming Zhang, Yu Zhang
Format: Article
Language: English
Published: IEEE 2025-01-01
Series: IEEE Access
Subjects:
Online Access: https://ieeexplore.ieee.org/document/10848120/
Description
Summary: Prompt tuning has shown impressive performance on few-shot text classification tasks, yet the coverage of its crucial module, the verbalizer, has a considerable effect on the results. Existing methods do not address both breadth and depth when constructing the verbalizer. Specifically, breadth refers to the cross-granularity coverage of label words, while depth refers to the number of elements within a granularity that make a positive contribution to classification. This study proposes a dynamic search tree (DST) method to further enhance the coverage of the verbalizer. The core idea is to exploit the hierarchical relationships within the tree to automatically unearth concealed high-quality words, ensuring that the constructed verbalizer possesses both greater breadth and greater depth. DST incorporates knowledgeable prompt tuning (KPT), leveraging the breadth of KPT's label word space, which encompasses characteristics at various granularities and from various perspectives, thereby addressing the verbalizer's breadth. Subsequently, a method for measuring the interrelation between words on a designated feature is proposed by analyzing their word vectors; it eliminates the noise introduced by irrelevant dimensions while extending the verbalizer, effectively enhancing the quality of the verbalizer in terms of depth. Extensive experiments on zero- and few-shot text classification tasks demonstrate the effectiveness of the method. The source code is publicly available at https://github.com/XianliangXia/VerbalizerConstrucionByDynamicSearchTree.
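The abstract's depth-related idea, measuring word interrelation on a designated feature while ignoring irrelevant dimensions, can be illustrated with a minimal sketch. This is not the paper's exact algorithm; the function name `masked_cosine`, the toy vectors, and the choice of a plain cosine score restricted to a dimension subset are all illustrative assumptions.

```python
import math

def masked_cosine(u, v, dims):
    """Cosine similarity between word vectors u and v, restricted to
    the dimensions in `dims` (a hypothetical feature subspace); all
    other dimensions are ignored, removing their noise contribution.

    Illustrative sketch only, not the DST paper's exact measure."""
    du = [u[i] for i in dims]
    dv = [v[i] for i in dims]
    dot = sum(a * b for a, b in zip(du, dv))
    nu = math.sqrt(sum(a * a for a in du))
    nv = math.sqrt(sum(b * b for b in dv))
    if nu == 0.0 or nv == 0.0:
        return 0.0
    return dot / (nu * nv)

# Toy 4-dimensional vectors: the first two dimensions carry the
# relevant feature; the last two are noise that swamps the full score.
w1 = [1.0, 0.5, 9.0, -3.0]
w2 = [0.9, 0.6, -7.0, 4.0]

full = masked_cosine(w1, w2, range(4))  # full-vector score, dominated by noise
sub = masked_cosine(w1, w2, [0, 1])     # score on the feature subspace only
```

On these toy vectors the full-vector similarity is strongly negative because of the noise dimensions, while the subspace score is close to 1, mirroring the abstract's point that dropping irrelevant dimensions can reveal related words that the raw vectors hide.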
ISSN:2169-3536