Graph attention, learning 2-opt algorithm for the traveling salesman problem

Bibliographic Details
Main Authors: Jia Luo, Herui Heng, Geng Wu
Format: Article
Language: English
Published: Springer 2025-01-01
Series: Complex & Intelligent Systems
Online Access: https://doi.org/10.1007/s40747-024-01716-5
Description
Summary: In recent years, deep graph neural networks (GNNs) have been used as solvers or helper functions for the traveling salesman problem (TSP), but they typically serve as encoders that generate static node representations for downstream tasks and cannot capture the dynamic permutational information that arises as solutions are repeatedly updated. To address this problem, we propose a permutational encoding graph attention encoder and attention-based decoder (PEG2A) model for the TSP, trained with the advantage actor-critic algorithm. In this work, the permutational encoding graph attention (PEGAT) network is designed to encode node embeddings, gathering information from neighbors while simultaneously capturing the dynamic graph permutational information. The attention-based decoder is tailored to compute probability distributions over the node pairs selected for 2-opt moves. The experimental results show that our method outperforms the compared learning-based algorithms and traditional heuristic methods.
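The local-search operation referenced in the summary is the 2-opt move: the decoder scores pairs of tour positions, and the selected pair has the segment between them reversed, exchanging two edges of the tour for two new ones. The sketch below illustrates only that generic move on a toy instance; the list-based tour representation and the helper names (tour_length, apply_2opt) are illustrative assumptions, not the paper's PEG2A implementation.

```python
# Minimal sketch of a 2-opt move on a TSP tour (illustrative, not the paper's code).
# A tour is a list of city indices; a 2-opt move picks positions (i, j) with i < j
# and reverses the segment between them, replacing two tour edges with two new ones.

import math

def tour_length(tour, coords):
    """Total length of a closed tour over 2-D city coordinates."""
    return sum(
        math.dist(coords[tour[k]], coords[tour[(k + 1) % len(tour)]])
        for k in range(len(tour))
    )

def apply_2opt(tour, i, j):
    """Return a new tour with the segment tour[i..j] reversed."""
    return tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]

if __name__ == "__main__":
    # Four cities on a unit square; the initial tour crosses itself.
    coords = [(0, 0), (1, 1), (1, 0), (0, 1)]
    tour = [0, 1, 2, 3]
    improved = apply_2opt(tour, 1, 2)  # reverse the segment at positions 1..2
    print(tour_length(tour, coords))      # ~4.83 (crossing tour)
    print(tour_length(improved, coords))  # 4.0  (crossing removed)
```

In the paper's setting, the pair (i, j) is not chosen by exhaustive search but sampled from the probability distribution produced by the attention-based decoder over node pairs.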
ISSN: 2199-4536, 2198-6053