DOI: 10.1145/3442381.3449945

Radflow: A Recurrent, Aggregated, and Decomposable Model for Networks of Time Series

Published: 03 June 2021

Abstract

We propose a new model for networks of time series that influence each other. Graph structures among time series are found in diverse domains, such as web traffic influenced by hyperlinks, product sales influenced by recommendation, or urban transport volume influenced by road networks and weather. There has been recent progress in graph modeling and in time series forecasting, respectively, but an expressive and scalable approach for a network of series does not yet exist. We introduce Radflow, a novel model that embodies three key ideas: a recurrent neural network to obtain node embeddings that depend on time, the aggregation of the flow of influence from neighboring nodes with multi-head attention, and the multi-layer decomposition of time series. Radflow naturally takes into account dynamic networks where nodes and edges change over time, and it can be used for prediction and data imputation tasks. On real-world datasets ranging from a few hundred to a few hundred thousand nodes, we observe that Radflow variants are the best performing model across a wide range of settings. The recurrent component in Radflow also outperforms N-BEATS, the state-of-the-art time series model. We show that Radflow can learn different trends and seasonal patterns, that it is robust to missing nodes and edges, and that correlated temporal patterns among network neighbors reflect influence strength. We curate WikiTraffic, the largest dynamic network of time series with 366K nodes and 22M time-dependent links spanning five years. This dataset provides an open benchmark for developing models in this area, with applications that include optimizing resources for the web. More broadly, Radflow has the potential to improve forecasts in correlated time series networks such as the stock market, and impute missing measurements in geographically dispersed networks of natural phenomena.
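Of the three ideas the abstract names, the second, aggregating the flow of influence from neighbors with multi-head attention, can be illustrated with a minimal NumPy sketch. This is not the paper's implementation: the function name, dimensions, and random projection matrices (which stand in for learned parameters) are all illustrative assumptions. The node's own embedding acts as the query, and the neighbors' embeddings supply the keys and values.

```python
import numpy as np

def multi_head_attention_aggregate(node_emb, neighbor_embs, num_heads=4, seed=0):
    """Aggregate neighbor embeddings into one influence vector via
    multi-head scaled dot-product attention (node embedding = query)."""
    d = node_emb.shape[0]
    assert d % num_heads == 0
    dh = d // num_heads
    rng = np.random.default_rng(seed)
    # Random projections stand in for learned weight matrices.
    Wq, Wk, Wv = (rng.standard_normal((d, d)) / np.sqrt(d) for _ in range(3))

    q = (node_emb @ Wq).reshape(num_heads, dh)           # (H, dh)
    k = (neighbor_embs @ Wk).reshape(-1, num_heads, dh)  # (N, H, dh)
    v = (neighbor_embs @ Wv).reshape(-1, num_heads, dh)  # (N, H, dh)

    # Per-head attention scores of the node against each neighbor.
    scores = np.einsum('hd,nhd->hn', q, k) / np.sqrt(dh)  # (H, N)
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)         # softmax over neighbors
    out = np.einsum('hn,nhd->hd', weights, v)             # (H, dh)
    return out.reshape(d)                                 # concatenate heads

node = np.ones(8)
neighbors = np.arange(24.0).reshape(3, 8)  # three neighbors, embedding dim 8
agg = multi_head_attention_aggregate(node, neighbors, num_heads=2)
print(agg.shape)  # (8,)
```

In the full model this aggregated vector would be combined with the node's recurrent state at each time step; here it simply shows how attention weights let a node draw unequal amounts of influence from different neighbors.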



    Published In

    WWW '21: Proceedings of the Web Conference 2021
    April 2021
    4054 pages
    ISBN: 9781450383127
    DOI: 10.1145/3442381

    Publisher

    Association for Computing Machinery, New York, NY, United States



    Author Tags

    1. graphs
    2. networks
    3. sequence models
    4. time series
    5. wikipedia

    Qualifiers

    • Research-article
    • Research
    • Refereed limited

    Conference

    WWW '21: The Web Conference 2021
    April 19 - 23, 2021
    Ljubljana, Slovenia

    Acceptance Rates

    Overall Acceptance Rate 1,899 of 8,196 submissions, 23%

    Article Metrics

    • Downloads (last 12 months): 34
    • Downloads (last 6 weeks): 0
    Reflects downloads up to 30 Aug 2024


    Cited By

    • (2024) STDNet: A Spatio-Temporal Decomposition Neural Network for Multivariate Time Series Forecasting. Tsinghua Science and Technology 29(4), 1232–1247. DOI: 10.26599/TST.2023.9010105
    • (2024) Graph Time-series Modeling in Deep Learning: A Survey. ACM Transactions on Knowledge Discovery from Data 18(5), 1–35. DOI: 10.1145/3638534
    • (2024) Integrating System State into Spatio Temporal Graph Neural Network for Microservice Workload Prediction. Proceedings of the 30th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, 5521–5531. DOI: 10.1145/3637528.3671508
    • (2024) Contextually enhanced ES-dRNN with dynamic attention for short-term load forecasting. Neural Networks 169(C), 660–672. DOI: 10.1016/j.neunet.2023.11.017
    • (2024) TIEN. Expert Systems with Applications 236(C). DOI: 10.1016/j.eswa.2023.121403
    • (2023) HDResNet: Hierarchical-Decomposition Residual Network for Hierarchical Time Series Forecasting. 2023 International Joint Conference on Neural Networks (IJCNN), 1–8. DOI: 10.1109/IJCNN54540.2023.10191455
    • (2023) TSFRN: Integrated Time and Spatial-Frequency domain based on Triple-links Residual Network for Sales Forecasting. 2023 IEEE 35th International Conference on Tools with Artificial Intelligence (ICTAI), 1012–1019. DOI: 10.1109/ICTAI59109.2023.00152
    • (2022) SRI-EEG: State-Based Recurrent Imputation for EEG Artifact Correction. Frontiers in Computational Neuroscience 16. DOI: 10.3389/fncom.2022.803384
