Traffic State Estimation With Spatio-Temporal Autoencoding Transformer (STAT Model)
| Main Authors: | , , |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | IEEE, 2025-01-01 |
| Series: | IEEE Access |
| Subjects: | |
| Online Access: | https://ieeexplore.ieee.org/document/11003092/ |
| Summary: | Traffic state estimation is an essential component of Intelligent Transportation Systems (ITS) designed to alleviate traffic congestion. As traffic data comprises intricate information that can also be affected by various factors, scholars have been attempting to utilize state-of-the-art deep learning forecasting models in recent years. However, a more complex and robust model is required to extract long-range correlations from large-scale traffic data sequences. To overcome the weaknesses of deep learning models, the superior performance of transformers is expected to address this effectively in time-series forecasting with transport data. Employing the capabilities of transformers in extracting long-term trends and dynamic dependencies, the proposed model improves deep learning prediction performance on real datasets. The findings indicate that the transformer-based model exhibited promising performance in forecasting long-term traffic patterns and characteristics with a large quantity of data. In this paper, a comparison of conventional hybrid deep learning models with the Spatio-Temporal Autoencoder Transformer (STAT) model was conducted using real-world datasets. The multi-head attention-based transformer model outperformed all other comparative approaches on large-scale data, demonstrating its strength across the error criteria. Comprehensive evaluations across various traffic prediction datasets have established the validity of the proposed approach. Also, for efficient model selection, the Akaike Information Criterion (AIC), Schwarz Bayesian Information Criterion (SBIC), Hannan-Quinn Information Criterion (HQIC), and corrected AIC (AICc) were used to evaluate and compare models based on their ability to balance fit and complexity. The proposed model has shown improvements in RMSE for the STREET I980 and PEMS-BAY datasets of 1.03 and 0.27, respectively. For 60-minute RMSE on the same datasets, the improvements were 0.84 for STREET I980 and 0.56 for PEMS-BAY. These findings underscore the potential of transformer-based models in enhancing the performance of traffic prediction systems. |
|---|---|
| ISSN: | 2169-3536 |
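The abstract names four information criteria (AIC, SBIC, HQIC, AICc) used to trade off goodness of fit against model complexity. As a minimal sketch of the standard formulas (not the authors' code), each criterion combines the maximized log-likelihood with a penalty on the number of estimated parameters; lower values indicate a better fit-complexity balance:

```python
import math


def information_criteria(log_likelihood, k, n):
    """Compute the standard model-selection criteria for a fitted model.

    log_likelihood: maximized log-likelihood of the model
    k: number of estimated parameters
    n: number of observations
    """
    # AIC penalizes each parameter by a constant 2.
    aic = 2 * k - 2 * log_likelihood
    # AICc adds a small-sample correction; valid when n > k + 1.
    aicc = aic + (2 * k * (k + 1)) / (n - k - 1)
    # SBIC (BIC) penalizes parameters more heavily as n grows.
    sbic = k * math.log(n) - 2 * log_likelihood
    # HQIC sits between AIC and SBIC in penalty strength.
    hqic = 2 * k * math.log(math.log(n)) - 2 * log_likelihood
    return {"AIC": aic, "AICc": aicc, "SBIC": sbic, "HQIC": hqic}
```

Given candidate models fitted on the same data, one would compute these values for each and prefer the model with the smallest criterion, using SBIC when a stronger complexity penalty is desired.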