LDC-GAT: A Lyapunov-Stable Graph Attention Network with Dynamic Filtering and Constraint-Aware Optimization

Bibliographic Details
Main Authors: Liping Chen, Hongji Zhu, Shuguang Han
Format: Article
Language: English
Published: MDPI AG, 2025-06-01
Series: Axioms
Subjects:
Online Access: https://www.mdpi.com/2075-1680/14/7/504
author Liping Chen
Hongji Zhu
Shuguang Han
collection DOAJ
description Graph attention networks are pivotal for modeling non-Euclidean data, yet they face dual challenges: training oscillations induced by projection-based high-dimensional constraints, and gradient anomalies due to poor adaptation to heterophilic structures. To address these issues, we propose LDC-GAT (Lyapunov-Stable Graph Attention Network with Dynamic Filtering and Constraint-Aware Optimization), which jointly optimizes both the forward and backward propagation processes. In the forward path, we introduce Dynamic Residual Graph Filtering, which integrates a tunable self-loop coefficient to balance neighborhood aggregation against self-feature retention. This filtering mechanism, constrained by a lower bound on Dirichlet energy, improves multi-head attention via multi-scale fusion and mitigates overfitting. In the backward path, we design Fro-FWNAdam, a gradient descent algorithm guided by a learning-rate-aware perceptron. An explicit Frobenius-norm bound on the weights, derived from Lyapunov theory, forms the basis of the perceptron. This stability-aware optimizer is embedded within a Frank–Wolfe framework with Nesterov acceleration, yielding a projection-free constrained optimization strategy that stabilizes training dynamics. Experiments on six benchmark datasets show that LDC-GAT outperforms GAT by 10.54% in classification accuracy and demonstrates strong robustness on heterophilic graphs.
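This record does not include the paper's equations, so the following is only a rough sketch of the Dynamic Residual Graph Filtering idea described in the abstract: blending each node's own features with a neighborhood aggregate through a tunable self-loop coefficient. The mean-aggregation form and the coefficient name `alpha` are assumptions for illustration, not the authors' formulation.

```python
import numpy as np

def residual_graph_filter(adj, feats, alpha=0.5):
    """Residual graph filtering sketch: blend each node's own features
    with its degree-normalized neighborhood aggregate via a tunable
    self-loop coefficient alpha. alpha=1 keeps only self-features;
    alpha=0 uses only the neighborhood mean."""
    deg = adj.sum(axis=1, keepdims=True)
    deg[deg == 0] = 1.0                      # avoid division by zero for isolated nodes
    agg = adj @ feats / deg                  # mean aggregation over neighbors
    return alpha * feats + (1.0 - alpha) * agg

# toy 3-node path graph with 2-dimensional features
A = np.array([[0., 1., 0.],
              [1., 0., 1.],
              [0., 1., 0.]])
X = np.array([[1., 0.],
              [0., 1.],
              [1., 1.]])
H = residual_graph_filter(A, X, alpha=0.5)
```

With `alpha=0.5`, every output row is the midpoint between a node's own features and its neighborhood mean, which is the balance between aggregation and self-feature retention the abstract refers to.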
id doaj-art-48fc505cd86b446ba02f1c2e7fc76655
issn 2075-1680
doi 10.3390/axioms14070504
affiliations Liping Chen: School of Science, Zhejiang Sci-Tech University, Hangzhou 310018, China; Hongji Zhu: School of Computer Science, Zhejiang Sci-Tech University, Hangzhou 310018, China; Shuguang Han: School of Science, Zhejiang Sci-Tech University, Hangzhou 310018, China
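The abstract states that the filtering mechanism is constrained by a lower bound on Dirichlet energy, a standard measure of how much node features vary across edges (low energy indicates over-smoothing). A minimal sketch of that quantity, assuming the unnormalized Laplacian L = D - A (the paper may well use a normalized variant):

```python
import numpy as np

def dirichlet_energy(adj, feats):
    """Dirichlet energy of node features on a graph:
    E = 0.5 * sum_ij A_ij * ||h_i - h_j||^2 = trace(H^T L H),
    where L = D - A is the unnormalized graph Laplacian. Low energy
    means neighboring features are nearly identical (over-smoothing);
    keeping E above a floor preserves discriminative features."""
    deg = np.diag(adj.sum(axis=1))
    lap = deg - adj
    return float(np.trace(feats.T @ lap @ feats))

# two connected nodes: distinct features vs fully smoothed features
A = np.array([[0., 1.],
              [1., 0.]])
H_distinct = np.array([[1., 0.],
                       [0., 1.]])
H_smoothed = np.array([[0.5, 0.5],
                       [0.5, 0.5]])
e1 = dirichlet_energy(A, H_distinct)   # features differ across the edge
e2 = dirichlet_energy(A, H_smoothed)   # identical features give zero energy
```

The smoothed configuration has zero energy, which is exactly the degenerate state a lower bound on Dirichlet energy is meant to rule out.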
topic LDC-GAT
DRG-Filtering
Fro-FWNAdam
multi-head weight threshold
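Fro-FWNAdam, listed among the keywords above, is described in the abstract as embedding a stability-aware optimizer in a Frank–Wolfe framework with Nesterov acceleration under a Frobenius-norm weight bound. The sketch below shows only the generic projection-free Frank–Wolfe step on a Frobenius-norm ball; the Adam-style moments, Nesterov acceleration, and the Lyapunov-derived radius are omitted because their details are not in this record.

```python
import numpy as np

def frank_wolfe_step(W, grad, radius, step):
    """One projection-free Frank-Wolfe update on the Frobenius-norm ball
    ||W||_F <= radius. The linear minimization oracle over this ball is
    S = -radius * grad / ||grad||_F; the iterate moves toward S by a
    convex combination, so it stays feasible without any projection."""
    g_norm = np.linalg.norm(grad)
    if g_norm == 0:
        return W
    S = -radius * grad / g_norm          # LMO solution on the ball boundary
    return (1.0 - step) * W + step * S   # convex combination stays in the ball

# toy problem: minimize ||W - T||_F^2 subject to ||W||_F <= 1,
# with the unconstrained optimum T lying outside the ball
T = np.full((2, 2), 2.0)
W = np.zeros((2, 2))
for t in range(200):
    grad = 2.0 * (W - T)
    W = frank_wolfe_step(W, grad, radius=1.0, step=2.0 / (t + 2))
```

Because every iterate is a convex combination of feasible points, the Frobenius-norm constraint holds at every step, which is the projection-free property the abstract credits with stabilizing training.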