An end-to-end attention-based approach for learning on graphs
Abstract: There has been a recent surge in transformer-based architectures for learning on graphs, mainly motivated by attention as an effective learning mechanism and the desire to supersede the hand-crafted operators characteristic of message passing schemes. However, concerns over their empirical...
| Main Authors: | David Buterez, Jon Paul Janet, Dino Oglic, Pietro Liò |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | Nature Portfolio, 2025-06-01 |
| Series: | Nature Communications |
| Online Access: | https://doi.org/10.1038/s41467-025-60252-z |
Similar Items
- AADNet: An End-to-End Deep Learning Model for Auditory Attention Decoding
  by: Nhan Duc Thanh Nguyen, et al.
  Published: (2025-01-01)
- End-Completely-Regular and End-Inverse Lexicographic Products of Graphs
  by: Hailong Hou, et al.
  Published: (2014-01-01)
- Non-end-to-end adaptive graph learning for multi-scale temporal traffic flow prediction
  by: Kang Xu, et al.
  Published: (2025-01-01)
- An End-to-End Concatenated CNN Attention Model for the Classification of Lung Cancer With XAI Techniques
  by: Fariha Haque, et al.
  Published: (2025-01-01)