Integrating Graph Neural Networks and Large Language Models for Stance Detection via Heterogeneous Stance Networks


Bibliographic Details
Main Authors: Xinyi Chen, Bo Liu, Huaping Hu, Yiqing Cai, Mengmeng Guo, Xingkong Ma
Format: Article
Language: English
Published: MDPI AG 2025-05-01
Series: Applied Sciences
Subjects:
Online Access: https://www.mdpi.com/2076-3417/15/11/5809
Description
Summary: Stance detection, the task of identifying the stance expressed in a text toward a specific target, is essential for analyzing public opinion across diverse domains. Existing approaches primarily focus on modeling the semantic relationship between the text and target, but they often struggle when the target is implicit or indirectly referenced. In real-world scenarios, stance is frequently conveyed through references to related entities, events, or contextual implications, making stance detection particularly challenging. To tackle this challenge, we propose a novel framework that leverages large language models to construct a heterogeneous stance network from textual data. Based on this network, we develop two complementary methodologies tailored for distinct application scenarios: (1) in a supervised setting, we employ a graph neural network approach to learn stance representations from the heterogeneous stance network, enhancing stance prediction performance; (2) for zero-shot stance detection, we introduce an LLM-based method that leverages the heterogeneous stance network to infer stance without task-specific supervision. Experimental results on benchmark datasets demonstrate that our methods outperform existing approaches, highlighting their effectiveness in both supervised and zero-shot scenarios.
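The core idea in the summary can be illustrated with a toy sketch: texts, entities, and targets become typed nodes in a heterogeneous graph, and message passing lets a text that never names the target inherit signal from related entities. Everything below is hypothetical (node names, relation labels, hand-set feature vectors, and the unweighted mean-aggregation step); the paper's actual pipeline builds the graph with an LLM and uses a trained GNN.

```python
import math
from collections import defaultdict

def build_hetero_graph(edges):
    """edges: (src, relation, dst) triples; returns an undirected
    adjacency map node -> [(relation, neighbor), ...]."""
    adj = defaultdict(list)
    for src, rel, dst in edges:
        adj[src].append((rel, dst))
        adj[dst].append((rel, src))
    return adj

def propagate(features, adj, rel_weight=None):
    """One round of relation-weighted mean aggregation -- a crude,
    parameter-free stand-in for a heterogeneous-GNN layer."""
    rel_weight = rel_weight or {}
    updated = {}
    for node, feat in features.items():
        msgs = [[rel_weight.get(rel, 1.0) * x for x in features[nbr]]
                for rel, nbr in adj.get(node, []) if nbr in features]
        if msgs:
            mean = [sum(col) / len(msgs) for col in zip(*msgs)]
            updated[node] = [0.5 * a + 0.5 * b for a, b in zip(feat, mean)]
        else:
            updated[node] = list(feat)
    return updated

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

# Toy network: the text mentions an entity linked to the target,
# but never mentions the target itself.
edges = [("doc1", "mentions", "wind_farm"),
         ("wind_farm", "related_to", "renewables")]
adj = build_hetero_graph(edges)
feats = {"doc1": [0.0, 1.0],
         "wind_farm": [1.0, 0.0],
         "renewables": [1.0, 0.0]}
new = propagate(feats, adj)
# After propagation, doc1's embedding drifts toward the target's,
# even though doc1 has no direct edge to "renewables".
```

The before/after cosine similarity between `doc1` and `renewables` rises from 0 to a positive value after one propagation round, which is the intuition behind using the stance network for implicit targets.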
ISSN: 2076-3417