GTAT: empowering graph neural networks with cross attention
Abstract Graph Neural Networks (GNNs) serve as a powerful framework for representation learning on graph-structured data, capturing node information by recursively aggregating and transforming the representations of neighboring nodes. Graph topology plays an important role in learning graph...
Main Authors: | Jiahao Shen, Qura Tul Ain, Yaohua Liu, Banqing Liang, Xiaoli Qiang, Zheng Kou |
---|---|
Format: | Article |
Language: | English |
Published: | Nature Portfolio, 2025-02-01 |
Series: | Scientific Reports |
Subjects: | Graph learning; Graph neural networks; Network topology; Cross attention mechanism |
Online Access: | https://doi.org/10.1038/s41598-025-88993-3 |
_version_ | 1823862355016024064 |
---|---|
author | Jiahao Shen; Qura Tul Ain; Yaohua Liu; Banqing Liang; Xiaoli Qiang; Zheng Kou |
author_facet | Jiahao Shen; Qura Tul Ain; Yaohua Liu; Banqing Liang; Xiaoli Qiang; Zheng Kou |
author_sort | Jiahao Shen |
collection | DOAJ |
description | Abstract Graph Neural Networks (GNNs) serve as a powerful framework for representation learning on graph-structured data, capturing node information by recursively aggregating and transforming the representations of neighboring nodes. Graph topology plays an important role in learning graph representations and strongly affects the performance of GNNs. However, current methods fail to adequately integrate topological information into graph representation learning. To better leverage topological information and enhance representation capabilities, we propose Graph Topology Attention Networks (GTAT). Specifically, GTAT first extracts topology features from the graph's structure and encodes them into topology representations. The node and topology representations are then fed into cross-attention GNN layers, where they interact. This integration allows the model to dynamically adjust the influence of node features and topological information, improving the expressiveness of node representations. Experimental results on various graph benchmark datasets demonstrate that GTAT outperforms recent state-of-the-art methods. Further analysis reveals GTAT's ability to mitigate the over-smoothing issue and its increased robustness to noisy data. |
format | Article |
id | doaj-art-f11a0c0b8dfb4195930600e970808f99 |
institution | Kabale University |
issn | 2045-2322 |
language | English |
publishDate | 2025-02-01 |
publisher | Nature Portfolio |
record_format | Article |
series | Scientific Reports |
spelling | doaj-art-f11a0c0b8dfb4195930600e970808f99 | 2025-02-09T12:35:26Z | eng | Nature Portfolio | Scientific Reports | 2045-2322 | 2025-02-01 | 15(1):1–13 | 10.1038/s41598-025-88993-3 | GTAT: empowering graph neural networks with cross attention | Jiahao Shen, Qura Tul Ain, Yaohua Liu, Banqing Liang, Zheng Kou (Institute of Computing Science and Technology, Guangzhou University); Xiaoli Qiang (School of Computer Science and Cyber Engineering, Guangzhou University) | https://doi.org/10.1038/s41598-025-88993-3 | Graph learning; Graph neural networks; Network topology; Cross attention mechanism |
spellingShingle | Jiahao Shen; Qura Tul Ain; Yaohua Liu; Banqing Liang; Xiaoli Qiang; Zheng Kou | GTAT: empowering graph neural networks with cross attention | Scientific Reports | Graph learning; Graph neural networks; Network topology; Cross attention mechanism |
title | GTAT: empowering graph neural networks with cross attention |
title_full | GTAT: empowering graph neural networks with cross attention |
title_fullStr | GTAT: empowering graph neural networks with cross attention |
title_full_unstemmed | GTAT: empowering graph neural networks with cross attention |
title_short | GTAT: empowering graph neural networks with cross attention |
title_sort | gtat empowering graph neural networks with cross attention |
topic | Graph learning; Graph neural networks; Network topology; Cross attention mechanism |
url | https://doi.org/10.1038/s41598-025-88993-3 |
work_keys_str_mv | AT jiahaoshen gtatempoweringgraphneuralnetworkswithcrossattention AT quratulain gtatempoweringgraphneuralnetworkswithcrossattention AT yaohualiu gtatempoweringgraphneuralnetworkswithcrossattention AT banqingliang gtatempoweringgraphneuralnetworkswithcrossattention AT xiaoliqiang gtatempoweringgraphneuralnetworkswithcrossattention AT zhengkou gtatempoweringgraphneuralnetworkswithcrossattention |
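The abstract above outlines GTAT's pipeline: extract topology features from the graph structure, encode them into topology representations, and let node and topology representations interact through cross-attention GNN layers before neighborhood aggregation. The paper's actual implementation is not included in this record, so the following is only a minimal illustrative sketch of that idea in plain PyTorch. All names here (`degree_features`, `CrossAttentionGNNLayer`) are hypothetical, and one-hot degree encoding is assumed as a deliberately simple stand-in for whatever topology features the paper actually extracts.

```python
# Illustrative sketch only -- not the authors' GTAT implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F


def degree_features(adj: torch.Tensor, max_degree: int = 32) -> torch.Tensor:
    # One-hot encode each node's degree as a simple "topology feature".
    # (An assumed stand-in for the paper's topology feature extraction.)
    deg = adj.sum(dim=1).long().clamp(max=max_degree - 1)
    return F.one_hot(deg, num_classes=max_degree).float()


class CrossAttentionGNNLayer(nn.Module):
    # Node representations attend to topology representations and vice
    # versa; each enriched stream is then aggregated over graph neighbors.
    def __init__(self, node_dim: int, topo_dim: int, hidden_dim: int, heads: int = 4):
        super().__init__()
        self.node_proj = nn.Linear(node_dim, hidden_dim)
        self.topo_proj = nn.Linear(topo_dim, hidden_dim)
        self.node_from_topo = nn.MultiheadAttention(hidden_dim, heads, batch_first=True)
        self.topo_from_node = nn.MultiheadAttention(hidden_dim, heads, batch_first=True)

    def forward(self, x, topo, adj):
        h = self.node_proj(x).unsqueeze(0)       # (1, N, d): nodes as a sequence
        t = self.topo_proj(topo).unsqueeze(0)    # (1, N, d)
        h_attn, _ = self.node_from_topo(h, t, t)  # node queries attend to topology
        t_attn, _ = self.topo_from_node(t, h, h)  # topology queries attend to nodes
        h_attn, t_attn = h_attn.squeeze(0), t_attn.squeeze(0)
        # Mean aggregation over neighbors (simple GCN-style message passing).
        deg = adj.sum(dim=1, keepdim=True).clamp(min=1.0)
        return torch.relu(adj @ h_attn / deg), torch.relu(adj @ t_attn / deg)


if __name__ == "__main__":
    N = 8
    adj = (torch.rand(N, N) < 0.3).float()
    adj = ((adj + adj.T) > 0).float().fill_diagonal_(0.0)  # symmetric, no self-loops
    x = torch.randn(N, 16)                                  # node features
    topo = degree_features(adj)                             # topology features
    layer = CrossAttentionGNNLayer(node_dim=16, topo_dim=32, hidden_dim=64)
    h_out, t_out = layer(x, topo, adj)
    print(h_out.shape, t_out.shape)  # torch.Size([8, 64]) torch.Size([8, 64])
```

Running both attention directions before aggregation is one plausible reading of the abstract's claim that the model "dynamically adjusts the influence of node features and topological information": each stream re-weights the other per node, rather than the two being concatenated once and fixed.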