Aspect Sentiment Triplet Extraction with Syntax-Semantics Graph Convolutional Network

Bibliographic Details
Main Authors: Jingyun Zhang, Shuwei Xu, Xin Gao, Zhiwei Tang
Format: Article
Language: English
Published: Springer 2025-07-01
Series: International Journal of Computational Intelligence Systems
Subjects:
Online Access: https://doi.org/10.1007/s44196-025-00900-w
Description
Summary: In the traditional aspect sentiment triplet extraction task, existing approaches typically exploit either syntactic or semantic features in isolation, failing to leverage the complementary integration of these two types of information. Although graph convolutional network-based approaches have demonstrated impressive performance on triplet extraction, they often ignore distance features and semantic information when encoding sentences. As a result, the integration of syntactic and semantic information remains suboptimal, which degrades sentiment analysis performance. To address this limitation, we propose a novel Syntax-Semantics Graph Convolutional Network for aspect sentiment triplet extraction. Our method first extracts syntactic structural information from the probability matrix of dependency trees, from which a mask matrix is constructed based on the varying distances between words. Next, semantic information is captured via a self-attention mechanism and an aspect-attention mechanism, yielding an attention score matrix. Finally, an interaction module is introduced to effectively integrate the syntactic and semantic features. Extensive experiments on several benchmark datasets demonstrate that our approach significantly outperforms existing baselines, achieving an average F1-score improvement of at least 1.083%.
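The distance-based mask matrix mentioned in the summary can be sketched as follows. This is a minimal illustration rather than the paper's implementation: the paper builds on a soft probability matrix from a dependency parser, whereas this sketch assumes hard head indices (`heads[i]` is the index of word `i`'s head, with `-1` for the root) and a hypothetical hop threshold `max_dist`.

```python
from collections import deque

def tree_distances(heads):
    """Pairwise hop distances between words in a dependency tree.

    heads[i] is the index of word i's head; the root has head -1.
    (Simplified input; the paper uses a parser's probability matrix
    rather than hard head indices.)
    """
    n = len(heads)
    # Treat head links as undirected edges.
    adj = [[] for _ in range(n)]
    for i, h in enumerate(heads):
        if h >= 0:
            adj[i].append(h)
            adj[h].append(i)
    dist = [[0] * n for _ in range(n)]
    for src in range(n):
        # Breadth-first search from each word.
        seen = {src}
        queue = deque([(src, 0)])
        while queue:
            node, d = queue.popleft()
            dist[src][node] = d
            for nb in adj[node]:
                if nb not in seen:
                    seen.add(nb)
                    queue.append((nb, d + 1))
    return dist

def distance_mask(heads, max_dist=2):
    """Binary mask: 1 where two words lie within max_dist tree hops."""
    dist = tree_distances(heads)
    n = len(heads)
    return [[1 if dist[i][j] <= max_dist else 0 for j in range(n)]
            for i in range(n)]
```

For "The food is great" with heads `[1, 3, 3, -1]`, the mask keeps word pairs within two hops of each other in the tree and zeroes out the rest; such a mask would then gate the adjacency used by a graph convolutional layer.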
ISSN: 1875-6883