Text classification model based on GNN and attention mechanism

To address the low classification accuracy that stems from the difficulty of dynamically aggregating unknown neighboring nodes in graph data and from insufficient fusion of semantic features, a model named graph attention text classificat...

Full description

Saved in:
Bibliographic Details
Main Authors: ZENG Shuifei, MENG Yao, LIU Jing
Format: Article
Language:zho
Published: Beijing Xintong Media Co., Ltd 2025-05-01
Series:Dianxin kexue
Subjects:
Online Access:http://www.telecomsci.com/thesisDetails#10.11959/j.issn.1000-0801.2025136
_version_ 1849470572890685440
author ZENG Shuifei
MENG Yao
LIU Jing
author_facet ZENG Shuifei
MENG Yao
LIU Jing
author_sort ZENG Shuifei
collection DOAJ
description To address the low classification accuracy that stems from the difficulty of dynamically aggregating unknown neighboring nodes in graph data and from insufficient fusion of semantic features, a model named graph attention text classification (GATC), based on a graph neural network (GNN) and an attention mechanism, was proposed. Firstly, an inductive graph neural network model was constructed, and unknown neighboring nodes were dynamically embedded by an aggregation function, enhancing the model's generalization ability. Secondly, a multi-head latent attention mechanism using low-rank key-value joint compression was introduced to shrink the key-value inference cache, which significantly reduced memory usage and improved model performance. Finally, the GNN was integrated with a gated recurrent unit (GRU) network to further capture the structural and temporal semantic features of the graph data, achieving efficient feature fusion and improving the model's classification accuracy. Experimental results show that the proposed method is effective and improves classification accuracy by at least 4.0%, 2.4% and 3.1% on the CSI 100, CSI 300 and Rus 1K datasets, respectively, compared with the ADGL+MLA (adaptive dynamic graph learning + multi-head latent attention) algorithm.
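The abstract's first step, inductive embedding of unseen neighbors via an aggregation function, is not spelled out; a minimal GraphSAGE-style mean-aggregation sketch (all weights, dimensions, and the ReLU nonlinearity are assumptions, not taken from the paper):

```python
import numpy as np

def mean_aggregate(h_self, neighbor_feats, W_self, W_neigh):
    """Embed a node never seen at training time: average its
    neighbors' features, then combine with its own features
    (GraphSAGE-style inductive aggregation)."""
    h_neigh = neighbor_feats.mean(axis=0)            # (d_in,)
    combined = W_self @ h_self + W_neigh @ h_neigh   # (d_out,)
    return np.maximum(combined, 0.0)                 # ReLU

# Toy data: an unseen node with 3 neighbors, d_in=4, d_out=2
rng = np.random.default_rng(0)
h_self = rng.normal(size=4)
neighbors = rng.normal(size=(3, 4))
W_self = rng.normal(size=(2, 4))
W_neigh = rng.normal(size=(2, 4))
h_new = mean_aggregate(h_self, neighbors, W_self, W_neigh)
print(h_new.shape)
```

Because the learned weights act on an aggregate of neighbors rather than on fixed node identities, the same function generalizes to nodes absent from the training graph, which is the generalization property the abstract claims.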
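The second step, low-rank key-value joint compression in multi-head latent attention, can be illustrated by caching a small latent vector per token instead of full keys and values; all dimensions and projection names below are illustrative assumptions:

```python
import numpy as np

# Assumed toy dimensions: model width 64, latent rank 8,
# 4 heads of width 16, sequence length 10
d_model, r, n_heads, d_head, T = 64, 8, 4, 16, 10
rng = np.random.default_rng(1)
W_dkv = rng.normal(size=(d_model, r)) / np.sqrt(d_model)    # down-projection
W_uk = rng.normal(size=(r, n_heads * d_head)) / np.sqrt(r)  # up-project keys
W_uv = rng.normal(size=(r, n_heads * d_head)) / np.sqrt(r)  # up-project values

x = rng.normal(size=(T, d_model))
c_kv = x @ W_dkv   # only this (T, r) latent is cached at inference time
k = c_kv @ W_uk    # keys reconstructed on the fly
v = c_kv @ W_uv    # values reconstructed on the fly

cache_full = T * 2 * n_heads * d_head   # naive K and V cache entries
cache_latent = T * r                    # compressed cache entries
print(cache_latent / cache_full)
```

With these toy sizes the cached state shrinks by a factor of 16 (80 vs. 1280 floats), which is the mechanism behind the reduced key-value inference cache and memory usage described in the abstract.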
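The third step fuses GNN outputs with a GRU to capture temporal attributes; a bare NumPy GRU cell run over a sequence of stand-in graph embeddings sketches the idea (the weights, dimension, and random inputs are placeholders, not the paper's architecture):

```python
import numpy as np

def gru_cell(h, x, Wz, Uz, Wr, Ur, Wh, Uh):
    """One GRU step: gates how much of the new graph embedding x
    replaces the accumulated temporal state h."""
    sig = lambda a: 1.0 / (1.0 + np.exp(-a))
    z = sig(Wz @ x + Uz @ h)                  # update gate
    r = sig(Wr @ x + Ur @ h)                  # reset gate
    h_tilde = np.tanh(Wh @ x + Uh @ (r * h))  # candidate state
    return (1.0 - z) * h + z * h_tilde

d = 4
rng = np.random.default_rng(2)
mats = [rng.normal(size=(d, d)) * 0.1 for _ in range(6)]
h = np.zeros(d)
for t in range(5):
    x_t = rng.normal(size=d)   # stand-in for the GNN embedding at step t
    h = gru_cell(h, x_t, *mats)
print(h.shape)
```

Feeding per-step structural embeddings through the recurrent gate is one plausible way the structural (GNN) and temporal (GRU) features described in the abstract could be fused into a single representation for classification.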
format Article
id doaj-art-cb09f13da1be4ae1bb4d5d0d91ce47dd
institution Kabale University
issn 1000-0801
language zho
publishDate 2025-05-01
publisher Beijing Xintong Media Co., Ltd
record_format Article
series Dianxin kexue
spelling doaj-art-cb09f13da1be4ae1bb4d5d0d91ce47dd2025-08-20T03:25:07ZzhoBeijing Xintong Media Co., LtdDianxin kexue1000-08012025-05-0141129140108554356Text classification model based on GNN and attention mechanismZENG ShuifeiMENG YaoLIU JingAddressing the issue of low classification accuracy raised by the poor performance of the model, which is caused by the difficulty in learning from dynamic aggregation unknown neighboring nodes of graph data and insufficient fusion of semantic features, a model named graph attention text classification(GATC) based on graph neural network (GNN) and attention mechanism was proposed. Firstly, an inductive learning of graph neural network model was constructed, and dynamic embedding the unknown neighboring node was implemented by using an aggregation function to enhance the model’s generalization ability. Secondly, the reasoning cache size of key-value was reduced by the introduction of multi-head latent attention mechanism that utilized the low-rank key-value joint compression technology, which significantly diminished memory usage and improved the performance of the model. Finally, the integration of GNN and gated recurrent unit (GRU) network models further captured the semantic feature information of structural and temporal attributes for graph data, resulting in achieving efficient feature fusion and improving the classification accuracy of the model. The experimental results show that the proposed method not only is effective, but also improves the accuracy of classification that is increased at least 4.0%, 2.4% and 3.1% on the CSI 100,CSI 300 and Rus 1K datasets, respectively, compared with the algorithm ADGL+MLA (adaptive dynamic graph learning+multi-head latent attention).http://www.telecomsci.com/thesisDetails#10.11959/j.issn.1000-0801.2025136GNNattention mechanismgraph datatext classification
spellingShingle ZENG Shuifei
MENG Yao
LIU Jing
Text classification model based on GNN and attention mechanism
Dianxin kexue
GNN
attention mechanism
graph data
text classification
title Text classification model based on GNN and attention mechanism
title_full Text classification model based on GNN and attention mechanism
title_fullStr Text classification model based on GNN and attention mechanism
title_full_unstemmed Text classification model based on GNN and attention mechanism
title_short Text classification model based on GNN and attention mechanism
title_sort text classification model based on gnn and attention mechanism
topic GNN
attention mechanism
graph data
text classification
url http://www.telecomsci.com/thesisDetails#10.11959/j.issn.1000-0801.2025136
work_keys_str_mv AT zengshuifei textclassificationmodelbasedongnnandattentionmechanism
AT mengyao textclassificationmodelbasedongnnandattentionmechanism
AT liujing textclassificationmodelbasedongnnandattentionmechanism