Traffic Forecasting Based on Integration of Adaptive Subgraph Reformulation and Spatio-Temporal Deep Learning Model
Abstract
1. Introduction
- (1) Regardless of the forecasting interval and the traffic condition, most existing models extract spatial features from the entire traffic network, with fixed nodes and a fixed network topology. However, the association relationships among traffic nodes are not invariable.
- (2) Deep-learning-based traffic forecasting methods such as LSTM, the Seq2Seq (Sequence to Sequence) model, and the Transformer generate predictions step by step. When dealing with long sequences of traffic data, prediction accuracy deteriorates because accumulated errors propagate through the successive steps.
- (1) An adaptive subgraph reformulation algorithm is proposed to reduce the impact of irrelevant spatio-temporal information on forecasting accuracy. The traffic nodes and the network topology are selected and reconstructed adaptively, based on reachability analysis and similarity quantization among traffic nodes within a specific forecasting interval.
- (2) A spatio-temporal deep learning model with a self-attention mechanism is designed to avoid the cumulative error propagation caused by long sequences of traffic data. The fusion of the time-series information, the spatial information, and the time-stamp information serves as the input, and all forecasting results are output in one forward operation by a generative decoder.
2. Methodology
2.1. Adaptive Subgraph Reformulation Algorithm
Algorithm 1: Adaptive subgraph reformulation algorithm
Require: the actual distances between traffic nodes; the maximum speed limit of each road segment; the forecasting interval; the similarity threshold H.
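The paper's algorithm box lists only the inputs, so the selection rule below is one illustrative reading rather than the authors' implementation: a node is kept if traffic can reach the target node within the forecasting interval at the segment speed limit, and if its historical series is sufficiently similar to the target's (here quantified with Pearson correlation, an assumed choice). The function and variable names are hypothetical.

```python
import numpy as np

def reformulate_subgraph(dist, v_max, series, target, interval, H):
    """Sketch of adaptive subgraph reformulation around one target node.

    dist     : (N, N) matrix of road distances between nodes
    v_max    : (N, N) matrix of maximum speed limits on each segment
    series   : (N, T) historical traffic series for each node
    target   : index of the node to forecast
    interval : forecasting interval (time units consistent with dist / v_max)
    H        : similarity threshold in [0, 1]
    """
    # Reachability: keep nodes whose traffic can arrive at the target
    # within one forecasting interval when moving at the speed limit.
    travel_time = dist[:, target] / v_max[:, target]
    reachable = np.where(travel_time <= interval)[0]

    # Similarity quantization: among reachable nodes, keep those whose
    # series correlate with the target's series above the threshold H.
    keep = [n for n in reachable
            if n == target
            or abs(np.corrcoef(series[n], series[target])[0, 1]) >= H]
    return sorted(keep)

# Toy network: node 1 is close and perfectly correlated with node 0;
# node 2 is too far away to influence node 0 within the interval.
dist = np.array([[0., 1., 5.], [1., 0., 5.], [5., 5., 0.]])
v_max = np.ones((3, 3))
t = np.arange(20.0)
series = np.vstack([t, 2 * t + 1, np.cos(t)])
sub = reformulate_subgraph(dist, v_max, series, target=0, interval=2.0, H=0.9)
# sub → [0, 1]: the distant node 2 is pruned from the subgraph
```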
2.2. Spatio-Temporal Deep Learning Model
2.2.1. Embedding Module of Traffic Data
- Scalar: the traffic data of each node in the reformulated subgraph;
- Network topology: the spatial characteristics derived from the reformulated subgraph;
- Position coding: the traffic flow data combined with the position coding of each node in the reformulated subgraph;
- Time stamp: external information expressed as an n-dimensional variable, covering year, month, day, hour, minute, week, weather, and holiday.
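The list above can be read as a Transformer-style additive embedding: project the scalar traffic values and the time-stamp features into a common model dimension, then add a sinusoidal position coding. This is a minimal sketch under that assumption; the projections are random stand-ins for learned weights, and all names are illustrative.

```python
import numpy as np

def positional_encoding(seq_len, d_model):
    # Standard sinusoidal position coding (sin on even dims, cos on odd dims).
    pos = np.arange(seq_len)[:, None]
    i = np.arange(d_model)[None, :]
    angle = pos / np.power(10000.0, (2 * (i // 2)) / d_model)
    return np.where(i % 2 == 0, np.sin(angle), np.cos(angle))

def embed_traffic_input(values, stamps, W_val, W_stamp):
    """Fuse scalar, time-stamp, and position information into one embedding.

    values  : (seq_len, n_nodes)  scalar traffic data of the subgraph nodes
    stamps  : (seq_len, n_stamp)  time-stamp features (month, day, hour, ...)
    W_val   : (n_nodes, d_model)  value projection (learned; random here)
    W_stamp : (n_stamp, d_model)  time-stamp projection (learned; random here)
    """
    seq_len = values.shape[0]
    d_model = W_val.shape[1]
    return values @ W_val + stamps @ W_stamp + positional_encoding(seq_len, d_model)

rng = np.random.default_rng(0)
E = embed_traffic_input(rng.normal(size=(12, 4)),   # 12 steps, 4 nodes
                        rng.normal(size=(12, 7)),   # 7 time-stamp features
                        rng.normal(size=(4, 16)),
                        rng.normal(size=(7, 16)))
# E has shape (12, 16): one d_model-sized vector per time step
```

The spatial characteristics of the subgraph topology would enter through the model's graph operations rather than this per-step embedding.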
2.2.2. Self-Attention Mechanism
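The self-attention mechanism here follows the scaled dot-product form of Vaswani et al.: every output position is a similarity-weighted mixture of all input positions, so distant time steps interact in a single step rather than through recurrence. A minimal single-head sketch (shapes and names are illustrative, not the authors' code):

```python
import numpy as np

def self_attention(X, W_q, W_k, W_v):
    """Single-head scaled dot-product self-attention over a sequence.

    X is (seq_len, d_model); W_q, W_k, W_v project it to queries, keys,
    and values of dimension d_k.
    """
    Q, K, V = X @ W_q, X @ W_k, X @ W_v
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                  # pairwise similarities
    scores -= scores.max(axis=-1, keepdims=True)     # numerically stable softmax
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)   # rows sum to 1
    return weights @ V

rng = np.random.default_rng(1)
X = rng.normal(size=(10, 8))
W_q, W_k, W_v = (rng.normal(size=(8, 8)) for _ in range(3))
out = self_attention(X, W_q, W_k, W_v)
# out has shape (10, 8): one attended representation per time step
```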
2.2.3. Encoder–Decoder Component
Algorithm 2: Spatio-temporal deep learning model
Require: the dataset of traffic flow data; the adjacency matrix G of the reformulated subgraph.
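The generative decoder produces all forecasting results in one forward operation instead of step by step. One common way to arrange this (as in Informer-style models; the construction below is an assumed sketch, not the authors' exact scheme) is to feed the decoder a start token of recent observations concatenated with zero placeholders for the horizon, which the single forward pass then fills in simultaneously, so no per-step error can accumulate.

```python
import numpy as np

def build_decoder_input(history, horizon, n_token):
    """Concatenate the last `n_token` observed steps with `horizon` zero
    placeholders; one decoder forward pass predicts all placeholders at once.
    """
    start_token = history[-n_token:]
    placeholder = np.zeros((horizon,) + history.shape[1:])
    return np.concatenate([start_token, placeholder], axis=0)

history = np.arange(24.0)                     # 24 observed steps (toy data)
dec_in = build_decoder_input(history, horizon=12, n_token=6)
# dec_in has 6 known steps followed by 12 zeroed slots to be predicted
```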
3. Experiments
3.1. Dataset
3.2. Experimental Settings and Results
4. Conclusions
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
References
- Mori, U.; Mendiburu, A. A review of travel time estimation and forecasting for advanced traveller information systems. Transp. A Transp. Sci. 2015, 11, 119–157.
- Chen, L.-W.; Hu, T.-Y. Traffic flow prediction with big data: A deep learning approach. IEEE Trans. Intell. Transp. Syst. 2015, 16, 865–873.
- Sanger, T.D. Optimal unsupervised learning in a single-layer linear feedforward neural network. Neural Netw. 1989, 2, 459–473.
- Huang, W.; Song, G. Deep architecture for traffic flow prediction: Deep belief networks with multitask learning. IEEE Trans. Intell. Transp. Syst. 2014, 15, 2191–2201.
- Li, Y.; Yu, R. Diffusion convolutional recurrent neural network: Data-driven traffic forecasting. arXiv 2017, arXiv:1707.01926.
- Ma, X.-L.; Tao, Z.-M. Long short-term memory neural network for traffic speed prediction using remote microwave sensor data. Transp. Res. Part C Emerg. Technol. 2015, 54, 187–197.
- Wang, W.; Zhou, C. Cellular traffic load prediction with LSTM and Gaussian process regression. In Proceedings of the 2020 IEEE International Conference on Communications (ICC), Dublin, Ireland, 7–11 June 2020.
- Jiang, W.; Zhang, L. Geospatial data to images: A deep-learning framework for traffic forecasting. Tsinghua Sci. Technol. 2019, 24, 52–64.
- Han, L.; Zheng, K.; Zhao, L. Short-term traffic prediction based on deep cluster in large-scale road networks. IEEE Trans. Veh. Technol. 2019, 68, 12301–12313.
- Battaglia, P.W.; Hamrick, J.B.; Bapst, V.; Sanchez-Gonzalez, A.; Zambaldi, V.; Malinowski, M.; Tacchetti, A.; Raposo, D.; Santoro, A.; Faulkner, R.; et al. Relational inductive biases, deep learning, and graph networks. arXiv 2018, arXiv:1806.01261.
- Scarselli, F.; Gori, M. The graph neural network model. IEEE Trans. Neural Netw. 2009, 20, 61–80.
- Wu, Z.-H.; Pan, S.-R. A comprehensive survey on graph neural networks. IEEE Trans. Neural Netw. Learn. Syst. 2021, 32, 4–24.
- Zhang, J.-W. Graph neural networks for small graph and giant network representation learning: An overview. arXiv 2019, arXiv:1908.00187.
- Guo, K. Optimized graph convolution recurrent neural network for traffic prediction. IEEE Trans. Intell. Transp. Syst. 2019, 22, 1138–1149.
- Cirstea, R.G.; Guo, C.-J. Graph attention recurrent neural networks for correlated time series forecasting. arXiv 2021, arXiv:2103.10760.
- Cui, Z.; Henrickson, K.; Ke, R. Traffic graph convolutional recurrent neural network: A deep learning framework for network-scale traffic learning and forecasting. IEEE Trans. Intell. Transp. Syst. 2020, 21, 4883–4894.
- Ke, J.-T.; Feng, S.-Y. Joint predictions of multi-modal ride-hailing demands: A deep multi-task multi-graph learning-based approach. Transp. Res. Part C Emerg. Technol. 2021, 127, 103063.
- Mohanty, S.; Pozdnukhov, A. Region-wide congestion prediction and control using deep learning. Transp. Res. Part C Emerg. Technol. 2020, 116, 102624.
- Leong, H.J.W. The distribution and trend of free speeds on two-lane rural highways in New South Wales. Aust. Road Res. Board Conf. 1968, 4, 791–814.
- Katti, B.K.; Raghavachari, S. Modelling of mixed traffic speed data as inputs for the traffic simulation models. Highw. Res. Bull. 1986, 28, 35–48.
- Kumar, V.M.; Rao, S.K. Headway and speed studies on two-lane highways. Indian Highw. 1998, 26, 23–26.
- Dreyfus, S.E. Dynamic programming and the calculus of variations. J. Math. Anal. Appl. 1960, 1, 228–239.
- Vaswani, A.; Shazeer, N.; Parmar, N.; Uszkoreit, J.; Jones, L.; Gomez, A.N.; Kaiser, Ł.; Polosukhin, I. Attention is all you need. In Proceedings of the Advances in Neural Information Processing Systems, Long Beach, CA, USA, 4–9 December 2017.
- Luong, M.-T.; Pham, H. Effective approaches to attention-based neural machine translation. arXiv 2015, arXiv:1508.04025.
- Bahdanau, D.; Cho, K. Neural machine translation by jointly learning to align and translate. arXiv 2016, arXiv:1409.0473.
- Hu, J.; Shen, L. Squeeze-and-excitation networks. IEEE Trans. Pattern Anal. Mach. Intell. 2020, 42, 2011–2023.
- Sun, Y.; Fisher, R. Object-based visual attention for computer vision. Artif. Intell. 2003, 146, 77–123.
- Mnih, V.; Heess, N. Recurrent models of visual attention. In Proceedings of the Advances in Neural Information Processing Systems, Montreal, QC, Canada, 8–13 December 2014; Volume 27, pp. 2204–2212.
- Lin, Z.; Li, M. Self-attention ConvLSTM for spatiotemporal prediction. In Proceedings of the AAAI Conference on Artificial Intelligence, New York, NY, USA, 7–12 February 2020; Volume 34, pp. 11531–11538.
- Zhou, H. Informer: Beyond efficient Transformer for long sequence time-series forecasting. In Proceedings of the AAAI Conference on Artificial Intelligence, Vancouver, BC, Canada, 2–9 February 2021; Volume 35, pp. 11106–11115.
- Cui, Z.; Ke, R.; Pu, Z.; Wang, Y. Deep bidirectional and unidirectional LSTM recurrent neural network for network-wide traffic speed prediction. arXiv 2018, arXiv:1801.02143.
- Agarap, A.F. A neural network architecture combining gated recurrent unit (GRU) and support vector machine (SVM) for intrusion detection in network traffic data. arXiv 2019, arXiv:1709.03082.
- Seo, Y.; Defferrard, M.; Vandergheynst, P.; Bresson, X. Structured sequence modeling with graph convolutional recurrent networks. In Proceedings of the International Conference on Neural Information Processing, Siem Reap, Cambodia, 13–16 December 2018; pp. 362–373.
- Yu, B.; Yin, H.-T. Spatio-temporal graph convolutional neural network: A deep learning framework for traffic forecasting. arXiv 2017, arXiv:1709.04875.
- Wu, Z.-H.; Pan, S.-R. Graph WaveNet for deep spatial-temporal graph modeling. arXiv 2019, arXiv:1906.00121.
Method | HA | ARIMA | LSTM | GRU | GCRN | Gated-STGCN | GWNET | Ours_WA | Ours_WS | Ours
---|---|---|---|---|---|---|---|---|---|---
MAE | 5.69 | 3.96 | 3.21 | 3.20 | 2.99 | 2.96 | 2.89 | 2.93 | 2.89 | 2.93
RMSE | 7.60 | 6.14 | 4.85 | 4.85 | 4.54 | 4.48 | 4.37 | 4.48 | 4.37 | 4.13
MAPE | 20.02% | 14.12% | 12.85% | 12.82% | 12.11% | 11.85% | 11.49% | 11.88% | 11.89% | 11.84%
Method | HA | ARIMA | LSTM | GRU | GCRN | Gated-STGCN | GWNET | Ours_WA | Ours_WS | Ours
---|---|---|---|---|---|---|---|---|---|---
MAE | 5.69 | 4.48 | 3.67 | 3.67 | 3.30 | 3.29 | 3.16 | 3.15 | 3.03 | 2.98
RMSE | 7.60 | 6.56 | 5.48 | 5.47 | 4.96 | 4.92 | 4.71 | 3.29 | 3.16 | 4.09
MAPE | 20.02% | 16.10% | 14.92% | 14.86% | 13.53% | 13.33% | 12.52% | 11.58% | 11.47% | 11.38%
Method | HA | ARIMA | LSTM | GRU | GCRN | Gated-STGCN | GWNET | Ours_WA | Ours_WS | Ours
---|---|---|---|---|---|---|---|---|---|---
MAE | 5.69 | 5.09 | 4.30 | 4.30 | 3.71 | 3.71 | 3.52 | 3.30 | 3.34 | 3.29
RMSE | 7.60 | 7.01 | 6.26 | 6.26 | 5.45 | 5.44 | 5.16 | 5.05 | 4.99 | 4.85
MAPE | 20.02% | 18.24% | 17.38% | 17.41% | 15.13% | 15.03% | 13.77% | 13.25% | 12.63% | 12.36%
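Assuming the three rows of each table report MAE, RMSE, and MAPE (the usual traffic-forecasting metrics, consistent with the percentage row and with MAE ≤ RMSE throughout), they can be computed as follows; the function name is illustrative.

```python
import numpy as np

def evaluate(y_true, y_pred):
    """MAE, RMSE, and MAPE between ground truth and predictions."""
    err = y_pred - y_true
    mae = np.mean(np.abs(err))
    rmse = np.sqrt(np.mean(err ** 2))
    mape = np.mean(np.abs(err / y_true)) * 100.0  # assumes y_true has no zeros
    return mae, rmse, mape

mae, rmse, mape = evaluate(np.array([1.0, 2.0, 4.0]),
                           np.array([2.0, 2.0, 2.0]))
# mae → 1.0, rmse → sqrt(5/3) ≈ 1.29, mape → 50.0
```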
© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Han, S.-Y.; Sun, Q.-W.; Zhao, Q.; Han, R.-Z.; Chen, Y.-H. Traffic Forecasting Based on Integration of Adaptive Subgraph Reformulation and Spatio-Temporal Deep Learning Model. Electronics 2022, 11, 861. https://doi.org/10.3390/electronics11060861