DOI: 10.1145/3583780.3615487
CIKM Conference Proceedings · Research article

Monotonic Neural Ordinary Differential Equation: Time-series Forecasting for Cumulative Data

Published: 21 October 2023

Abstract

Time-Series Forecasting based on Cumulative Data (TSFCD) is a crucial problem in decision-making across various industrial scenarios. However, existing time-series forecasting methods often overlook two important characteristics of cumulative data, namely monotonicity and irregularity, which limit their practical applicability. To address this limitation, we propose a principled approach called Monotonic neural Ordinary Differential Equation (MODE) within the framework of neural ordinary differential equations. By leveraging MODE, we are able to effectively capture and represent the monotonicity and irregularity in practical cumulative data. Through extensive experiments conducted in a bonus allocation scenario, we demonstrate that MODE outperforms state-of-the-art methods, showcasing its ability to handle both monotonicity and irregularity in cumulative data and delivering superior forecasting performance.
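
As a rough illustration of the general idea (not the authors' implementation), a neural ODE can be made monotone by constraining its learned derivative to be non-negative, and irregular sampling can be handled by querying the ODE solver at the raw observation times rather than on a fixed grid. The minimal sketch below assumes this simple softplus-based construction in PyTorch, using torchdiffeq's public odeint solver; the class name, dimensions, and time grid are illustrative only.

# Illustrative sketch: a monotone neural ODE for cumulative series.
# dz/dt = softplus(f_theta(z, t)) >= 0, so the integrated state can only
# increase over time; irregular observations are handled by evaluating the
# solver at the (possibly non-uniform) observation times.
import torch
import torch.nn as nn
from torchdiffeq import odeint  # pip install torchdiffeq


class MonotonicODEFunc(nn.Module):
    """Derivative network whose output is forced non-negative via softplus."""

    def __init__(self, hidden_dim: int = 32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(1 + 1, hidden_dim),  # input: current state and time
            nn.Tanh(),
            nn.Linear(hidden_dim, 1),
        )

    def forward(self, t, z):
        t_feat = t.expand(z.shape[0], 1)               # broadcast scalar time
        return nn.functional.softplus(self.net(torch.cat([z, t_feat], dim=-1)))


def forecast(z0, obs_times, func):
    """Integrate from the initial cumulative value z0 to each irregular time."""
    return odeint(func, z0, obs_times)  # shape: (len(obs_times), batch, 1)


if __name__ == "__main__":
    func = MonotonicODEFunc()
    z0 = torch.zeros(4, 1)                      # four series starting at 0
    times = torch.tensor([0.0, 0.3, 1.1, 2.0])  # irregular observation grid
    traj = forecast(z0, times, func)
    # Monotonicity holds by construction: successive values never decrease.
    assert (traj[1:] - traj[:-1] >= -1e-6).all()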

Supplementary Material

MP4 File (MODE.mp4)
Presentation video for the paper "Monotonic Neural Ordinary Differential Equation: Time-series Forecasting for Cumulative Data".




Published In

CIKM '23: Proceedings of the 32nd ACM International Conference on Information and Knowledge Management
October 2023
5508 pages
ISBN:9798400701245
DOI:10.1145/3583780
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 21 October 2023


Author Tags

  1. cumulative time-series
  2. neural ordinary differential equation
  3. time-series forecasting

Qualifiers

  • Research-article

Conference

CIKM '23

Acceptance Rates

Overall Acceptance Rate 1,861 of 8,427 submissions, 22%

Article Metrics

  • Downloads (last 12 months): 161
  • Downloads (last 6 weeks): 13
Reflects downloads up to 09 Nov 2024

Cited By

  • Multiscale Representation Enhanced Temporal Flow Fusion Model for Long-Term Workload Forecasting. Proceedings of the 33rd ACM International Conference on Information and Knowledge Management (Oct 2024), 4948-4956. https://doi.org/10.1145/3627673.3680072
  • An Accurate and Interpretable Framework for Trustworthy Process Monitoring. IEEE Transactions on Artificial Intelligence, Vol. 5, 5 (May 2024), 2241-2252. https://doi.org/10.1109/TAI.2023.3319606
  • Hidformer: Hierarchical dual-tower transformer using multi-scale mergence for long-term time series forecasting. Expert Systems with Applications, Vol. 239 (Apr 2024), 122412. https://doi.org/10.1016/j.eswa.2023.122412
  • Attempt of Graph Neural Network Algorithm in the Field of Financial Anomaly Detection. Proceedings of the 2nd International Conference on Internet of Things, Communication and Intelligent Technology (Apr 2024), 616-623. https://doi.org/10.1007/978-981-97-2757-5_65
  • Temporal Attention Convolutional Neural Networks Based on LSTM-Encoder for Time Series Forecasting. 2023 International Conference on Networks, Communications and Intelligent Computing (NCIC) (Nov 2023), 51-54. https://doi.org/10.1109/NCIC61838.2023.00014
  • Research on Graph Neural Network Algorithms for Financial Anomaly Detection. 2023 International Conference on Networks, Communications and Intelligent Computing (NCIC) (Nov 2023), 18-23. https://doi.org/10.1109/NCIC61838.2023.00009
