Fusing multiplex heterogeneous networks using graph attention-aware fusion networks

Bibliographic Details
Main Authors: Ziynet Nesibe Kesimoglu, Serdar Bozdag
Format: Article
Language: English
Published: Nature Portfolio 2024-11-01
Series: Scientific Reports
Subjects: Attention aware network fusion; Graph neural networks; Drug ADR prediction
Online Access: https://doi.org/10.1038/s41598-024-78555-4
author Ziynet Nesibe Kesimoglu
Serdar Bozdag
collection DOAJ
description Abstract Graph Neural Networks (GNNs) emerged as a deep learning framework to generate node and graph embeddings for downstream machine learning tasks. Popular GNN-based architectures operate on networks with a single node and edge type. However, many real-world networks include multiple types of nodes and edges. Enabling these architectures to work on networks with multiple node and edge types brings additional challenges due to the heterogeneity of the networks and the multiplicity of the existing associations. In this study, we present a framework, named GRAF (Graph Attention-aware Fusion Networks), that converts multiplex heterogeneous networks to homogeneous networks to make them more suitable for graph representation learning. Using attention-based neighborhood aggregation, GRAF learns the importance of each neighbor per node (called node-level attention) and then the importance of each network layer (called network layer-level attention). GRAF then performs a network fusion step, weighting each edge according to the learned attentions. After an edge elimination step based on edge weights, GRAF applies Graph Convolutional Networks (GCN) to the fused network, incorporating node features on the graph-structured data for node classification or a similar downstream task. To demonstrate GRAF's generalizability, we applied it to four datasets from different domains and observed that GRAF outperformed or was on par with the baselines and state-of-the-art (SOTA) methods. We were able to interpret GRAF's findings using the attention weights. Source code for GRAF is publicly available at https://github.com/bozdaglab/GRAF.
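The abstract describes a pipeline: learn node-level attention (importance of each neighbor) and network layer-level attention (importance of each layer), weight every edge by both, fuse the layers into one network, and eliminate weak edges before running a GCN. The fusion and edge-elimination steps can be sketched in plain Python; this is a simplified illustration, not GRAF's actual API — the function name, the edge-dict representation, and the `keep_ratio` threshold are all assumptions for the sketch, and the attentions are taken as already learned.

```python
def fuse_network_layers(layer_edges, layer_attn, keep_ratio=0.5):
    """Fuse multiplex network layers into one weighted edge list.

    layer_edges: list (one per layer) of {(u, v): alpha} dicts, where alpha is
                 the learned node-level attention for edge (u, v) in that layer
    layer_attn:  list of learned network layer-level attention weights
    keep_ratio:  fraction of fused edges to keep after edge elimination
    """
    fused = {}
    # weight each edge by its node-level attention times its layer's attention,
    # summing contributions from every layer the edge appears in
    for edges, beta in zip(layer_edges, layer_attn):
        for (u, v), alpha in edges.items():
            fused[(u, v)] = fused.get((u, v), 0.0) + beta * alpha
    # edge elimination: keep only the strongest keep_ratio fraction of edges
    ranked = sorted(fused.items(), key=lambda kv: kv[1], reverse=True)
    n_keep = max(1, int(len(ranked) * keep_ratio))
    return dict(ranked[:n_keep])


# hypothetical two-layer multiplex network over nodes a, b, c
layer1 = {("a", "b"): 0.7, ("b", "c"): 0.3}
layer2 = {("a", "b"): 0.4, ("a", "c"): 0.6}
fused = fuse_network_layers([layer1, layer2], [0.6, 0.4], keep_ratio=0.5)
```

With these toy attentions, the edge ("a", "b") accumulates weight from both layers (0.6·0.7 + 0.4·0.4 = 0.58) and survives elimination, while the single-layer edges are pruned. The resulting weighted homogeneous network would then be handed, together with node features, to a standard GCN for the downstream task.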
format Article
id doaj-art-896caf108117458b9d06e650404052f2
institution OA Journals
issn 2045-2322
language English
publishDate 2024-11-01
publisher Nature Portfolio
record_format Article
series Scientific Reports
spelling doaj-art-896caf108117458b9d06e650404052f2
2025-08-20T02:08:24Z | eng | Nature Portfolio | Scientific Reports | ISSN 2045-2322 | 2024-11-01 | vol. 14, no. 1, pp. 1-11 | 10.1038/s41598-024-78555-4
Fusing multiplex heterogeneous networks using graph attention-aware fusion networks
Ziynet Nesibe Kesimoglu, Department of Computer Science and Engineering, University of North Texas
Serdar Bozdag, Department of Computer Science and Engineering, University of North Texas
https://doi.org/10.1038/s41598-024-78555-4
Attention aware network fusion; Graph neural networks; Drug ADR prediction
title Fusing multiplex heterogeneous networks using graph attention-aware fusion networks
topic Attention aware network fusion
Graph neural networks
Drug ADR prediction
url https://doi.org/10.1038/s41598-024-78555-4