Graph Sequence Neural Network with an Attention Mechanism for Traffic Speed Prediction

Published: 04 March 2022
Abstract

    Recent years have witnessed the emerging success of Graph Neural Networks (GNNs) for modeling graphical data. A GNN can model the spatial dependencies of nodes in a graph based on message passing through node aggregation. However, in many application scenarios, these spatial dependencies can change over time, and a basic GNN model cannot capture these changes. In this article, we propose a Graph Sequence neural network with an Attention mechanism (GSeqAtt) for processing graph sequences. More specifically, two attention mechanisms are combined: a horizontal mechanism and a vertical mechanism. GTransformer, which is a horizontal attention mechanism for handling time series, is used to capture the correlations between graphs in the input time sequence. The vertical attention mechanism, a Graph Network (GN) block structure with an attention mechanism (GNAtt), acts within the graph structure in each frame of the time series. Experiments show that our proposed model is able to handle information propagation for graph sequences accurately and efficiently. Moreover, results on real-world data from three road intersections show that our GSeqAtt outperforms state-of-the-art baselines on the traffic speed prediction task.
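The abstract's two-level design can be sketched compactly: a "vertical" graph attention that aggregates each node's neighbors within a single frame, followed by a "horizontal" self-attention over the time axis. The sketch below is an illustrative reconstruction, not the authors' implementation; the function names, the single-head dot-product attention, and the shared adjacency matrix are all assumptions:

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def graph_attention(H, A):
    """Vertical attention (GNAtt-like): each node attends only to its
    neighbors, masked by the adjacency matrix A, within one frame."""
    scores = H @ H.T / np.sqrt(H.shape[-1])   # pairwise node similarity
    scores = np.where(A > 0, scores, -1e9)    # mask out non-edges
    return softmax(scores, axis=-1) @ H       # neighbor aggregation

def temporal_attention(S):
    """Horizontal attention (GTransformer-like): each time step of a
    node's feature sequence attends to every other step."""
    X = S.transpose(1, 0, 2)                  # (T, N, d) -> (N, T, d)
    scores = X @ X.transpose(0, 2, 1) / np.sqrt(X.shape[-1])
    out = softmax(scores, axis=-1) @ X
    return out.transpose(1, 0, 2)             # back to (T, N, d)

def gseq_att_sketch(S, A):
    """S: graph sequence of node features (T, N, d); A: adjacency (N, N)."""
    spatial = np.stack([graph_attention(S[t], A) for t in range(S.shape[0])])
    return temporal_attention(spatial)

rng = np.random.default_rng(0)
T, N, d = 4, 5, 8                             # 4 frames, 5 nodes, 8 features
S = rng.standard_normal((T, N, d))
A = np.eye(N) + np.eye(N, k=1) + np.eye(N, k=-1)  # toy chain road graph
out = gseq_att_sketch(S, A)
print(out.shape)  # (4, 5, 8)
```

In the paper, the vertical block additionally updates edge and global attributes (a GN block), and learned projections replace the raw dot products; this sketch only shows how the two attention directions compose.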





Published In

ACM Transactions on Intelligent Systems and Technology, Volume 13, Issue 2 (April 2022), 392 pages
ISSN: 2157-6904
EISSN: 2157-6912
DOI: 10.1145/3508464
Editor: Huan Liu

      Publisher

      Association for Computing Machinery

      New York, NY, United States

      Publication History

      Published: 04 March 2022
      Accepted: 01 June 2021
      Revised: 01 May 2021
      Received: 01 November 2020
      Published in TIST Volume 13, Issue 2


      Author Tags

      1. Graph neural network
      2. self-attention
      3. traffic speed prediction

      Qualifiers

      • Research-article
      • Refereed

      Funding Sources

      • National Key Technology Support Program
      • Guangxi Innovation-Driven Development Special Fund Project
      • National Natural Science Foundation of China
      • Beijing Municipal Science and Technology Project

Article Metrics

• Downloads (last 12 months): 394
• Downloads (last 6 weeks): 27
      Reflects downloads up to 10 Aug 2024

Cited By
• (2024) Anomaly Detection in Medical Time Series with Generative Adversarial Networks: A Selective Review. Anomaly Detection - Recent Advances, AI and ML Perspectives and Applications. DOI: 10.5772/intechopen.112582. Online publication date: 17-Jan-2024
• (2024) Score-based Graph Learning for Urban Flow Prediction. ACM Transactions on Intelligent Systems and Technology 15, 3, 1–25. DOI: 10.1145/3655629. Online publication date: 17-May-2024
• (2024) Optimizing Urban Traffic Flow Prediction: Integrating Spatial–Temporal Analysis with a Hybrid GNN and Gated-Attention GRU Model. Smart Data Intelligence, 381–391. DOI: 10.1007/978-981-97-3191-6_29. Online publication date: 28-Jul-2024
• (2023) Isomorphic Graph Embedding for Progressive Maximal Frequent Subgraph Mining. ACM Transactions on Intelligent Systems and Technology 15, 1, 1–26. DOI: 10.1145/3630635. Online publication date: 19-Dec-2023
• (2023) Faithful and Consistent Graph Neural Network Explanations with Rationale Alignment. ACM Transactions on Intelligent Systems and Technology 14, 5, 1–23. DOI: 10.1145/3616542. Online publication date: 9-Oct-2023
• (2023) Graph Neural Rough Differential Equations for Traffic Forecasting. ACM Transactions on Intelligent Systems and Technology 14, 4, 1–27. DOI: 10.1145/3604808. Online publication date: 21-Jul-2023
• (2023) Diffuse and Smooth: Beyond Truncated Receptive Field for Scalable and Adaptive Graph Representation Learning. ACM Transactions on Knowledge Discovery from Data 17, 5, 1–25. DOI: 10.1145/3572781. Online publication date: 27-Feb-2023
• (2023) Dynamic Multi-View Graph Neural Networks for Citywide Traffic Inference. ACM Transactions on Knowledge Discovery from Data 17, 4, 1–22. DOI: 10.1145/3564754. Online publication date: 24-Feb-2023
• (2023) Spatial–Temporal Traffic Modeling With a Fusion Graph Reconstructed by Tensor Decomposition. IEEE Transactions on Intelligent Transportation Systems 25, 2, 1749–1760. DOI: 10.1109/TITS.2023.3314134. Online publication date: 22-Sep-2023
• (2023) Adaptive self-propagation graph convolutional network for recommendation. World Wide Web 26, 5, 3183–3206. DOI: 10.1007/s11280-023-01182-y. Online publication date: 1-Jul-2023