Graph neural networks with configuration cross-attention for tensor compilers
With the recent popularity of neural networks comes the need for efficient serving of inference workloads. A neural network inference workload can be represented as a computational graph with nodes as operators transforming multidimensional tensors. The tensors can be transposed and/or tiled in a co...
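The abstract's framing of an inference workload as a computational graph of operators, with tensors that can be transposed and/or tiled, can be sketched as a toy data structure. This is a minimal illustration only; the class names, fields, and layout options below are assumptions for exposition, not the paper's actual schema.

```python
# Sketch: an inference workload as a graph of operator nodes whose output
# tensors carry a layout configuration (transpose / tiling choice).
# All names and fields are illustrative assumptions.
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Tensor:
    shape: tuple                 # logical shape, e.g. (batch, channels, h, w)
    layout: str = "NCHW"         # one possible configuration: memory layout
    tile: Optional[int] = None   # optional tile size chosen by the compiler

@dataclass
class OpNode:
    name: str
    op: str                                     # operator type, e.g. "conv2d"
    inputs: list = field(default_factory=list)  # names of upstream nodes
    output: Optional[Tensor] = None

# A tiny two-node graph: conv2d -> relu.
graph = {
    "conv1": OpNode("conv1", "conv2d", inputs=["input"],
                    output=Tensor((1, 64, 56, 56), layout="NCHW", tile=8)),
    "relu1": OpNode("relu1", "relu", inputs=["conv1"],
                    output=Tensor((1, 64, 56, 56), layout="NHWC")),
}

# Each (node, configuration) pair is a candidate the compiler must rank.
candidates = [(n.name, n.output.layout, n.output.tile) for n in graph.values()]
```

A learned model such as the GNN with configuration cross-attention described in the title would then score such candidate configurations instead of exhaustively benchmarking them.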
| Main Authors: | Dmitrii Khizbullin, Eduardo Rocha de Andrade, Thanh Hau Nguyen, Matheus Pedroza Ferreira, David R. Pugh |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | Frontiers Media S.A., 2025-08-01 |
| Series: | Frontiers in Artificial Intelligence |
| Online Access: | https://www.frontiersin.org/articles/10.3389/frai.2025.1605539/full |
Similar Items
- Text classification model based on GNN and attention mechanism, by: ZENG Shuifei, et al. Published: (2025-05-01)
- PhishingGNN: Phishing Email Detection Using Graph Attention Networks and Transformer-Based Feature Extraction, by: Mejdl Safran, et al. Published: (2025-01-01)
- Low-Rank Tensor Thresholding Ridge Regression, by: Kailing Guo, et al. Published: (2019-01-01)
- Topology-aware tensor decomposition for meta-graph learning, by: Hansi Yang, et al. Published: (2025-06-01)
- Graph-Attention Diffusion for Enhanced Multivariate Time-Series Anomaly Detection, by: Vadim Lanko, et al. Published: (2024-01-01)