DOI: 10.5555/3666122.3668307

Automatic integration for spatiotemporal neural point processes

Published: 30 May 2024

Abstract

Learning continuous-time point processes is essential to many discrete event forecasting tasks. However, integration poses a major challenge, particularly for spatiotemporal point processes (STPPs), as computing the likelihood requires triple integrals over space and time. Existing methods for integrating STPPs either assume a parametric form of the intensity function, which lacks flexibility, or approximate the intensity with Monte Carlo sampling, which introduces numerical errors. Recent work by Omi et al. [2019] proposes a dual network approach for efficient integration of a flexible intensity function, but their method addresses only the 1D temporal point process. In this paper, we introduce a novel paradigm, AutoSTPP (Automatic Integration for Spatiotemporal Neural Point Processes), which extends the dual network approach to 3D STPPs. While the previous work provides a foundation, its direct extension overly restricts the intensity function and leads to computational challenges. In response, we introduce a decomposable parametrization for the integral network using ProdNet, which builds the integral network from products of simplified univariate computational graphs and thereby sidesteps the computational complexity inherent in multivariate computational graphs. We prove the consistency of AutoSTPP and validate it on synthetic data and benchmark real-world datasets. AutoSTPP shows a significant advantage in recovering complex intensity functions from irregular spatiotemporal events, particularly when the intensity is sharply localized. Our code is open-source at https://github.com/Rose-STL-Lab/AutoSTPP.
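To make the dual-network construction concrete: such a model trains the antiderivative (integral) network F directly, recovers the intensity as the mixed partial derivative λ = ∂³F/∂x∂y∂t via automatic differentiation, and reduces the triple integral in the likelihood to evaluations of F at the eight corners of the integration box. The PyTorch sketch below is a minimal illustration under our own assumptions (the class names, layer sizes, and rank are hypothetical, and the positivity/monotonicity constraints a real intensity model requires are omitted for brevity); it is not the authors' implementation, which is available at the linked repository.

```python
# Minimal sketch of the dual-network (AutoInt) idea with a ProdNet-style
# decomposition. Illustrative only: names and sizes are assumptions, and
# the constraints that keep the intensity nonnegative are omitted.
import itertools

import torch
import torch.nn as nn


class Univariate(nn.Module):
    """A small 1D network u -> R whose derivative graph stays univariate."""

    def __init__(self, hidden=32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(1, hidden), nn.Tanh(),
            nn.Linear(hidden, hidden), nn.Tanh(),
            nn.Linear(hidden, 1),
        )

    def forward(self, u):
        return self.net(u)


class ProdNet(nn.Module):
    """Integral network F(x, y, t) = sum_i f_i(x) * g_i(y) * h_i(t)."""

    def __init__(self, rank=8):
        super().__init__()
        self.f = nn.ModuleList(Univariate() for _ in range(rank))
        self.g = nn.ModuleList(Univariate() for _ in range(rank))
        self.h = nn.ModuleList(Univariate() for _ in range(rank))

    def forward(self, x, y, t):
        return sum(f(x) * g(y) * h(t)
                   for f, g, h in zip(self.f, self.g, self.h))


def intensity(model, x, y, t):
    """lambda = d^3 F / (dx dy dt), obtained by nested autodiff.

    With the ProdNet decomposition this equals sum_i f_i'(x) g_i'(y) h_i'(t),
    a sum of univariate derivatives rather than one 3D mixed-partial graph.
    """
    x, y, t = (v.clone().requires_grad_(True) for v in (x, y, t))
    F = model(x, y, t).sum()
    Fx = torch.autograd.grad(F, x, create_graph=True)[0]
    Fxy = torch.autograd.grad(Fx.sum(), y, create_graph=True)[0]
    return torch.autograd.grad(Fxy.sum(), t, create_graph=True)[0]


def box_integral(model, lo, hi):
    """Integral of the intensity over the box [lo, hi], computed by
    evaluating F at the 8 corners with inclusion-exclusion signs."""
    total = 0.0
    for corner in itertools.product((0, 1), repeat=3):
        point = [(hi if c else lo)[d].view(1, 1) for d, c in enumerate(corner)]
        sign = (-1) ** (3 - sum(corner))  # one factor of -1 per lower endpoint
        total = total + sign * model(*point)
    return total


model = ProdNet()
x, y, t = (torch.rand(5, 1) for _ in range(3))
lam = intensity(model, x, y, t)                           # (5, 1) intensities
vol = box_integral(model, torch.zeros(3), torch.ones(3))  # integral term
```

The computational point of the ProdNet decomposition is visible in `intensity`: because F is a sum of products of univariate factors, the triple mixed partial is a sum of products of 1D derivatives, so automatic differentiation never has to trace a dense multivariate graph, and the likelihood's integral term costs only eight forward passes of F.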

References

[1]
Ryan Prescott Adams, Iain Murray, and David JC MacKay. Tractable nonparametric Bayesian inference in Poisson processes with Gaussian process intensities. In Proceedings of the 26th Annual International Conference on Machine Learning, pages 9-16, 2009.
[2]
Anders Brix and Peter J Diggle. Spatiotemporal prediction for log-Gaussian Cox processes. Journal of the Royal Statistical Society: Series B (Statistical Methodology), 63(4):823-841, 2001.
[3]
Anders Brix and Jesper Møller. Space-time multi-type log-Gaussian Cox processes with a view to modelling weeds. Scandinavian Journal of Statistics, 28(3):471-488, 2001.
[4]
Ricky TQ Chen, Brandon Amos, and Maximilian Nickel. Neural spatio-temporal point processes. arXiv preprint arXiv:2011.04583, 2020.
[5]
Yuanda Chen. Thinning algorithms for simulating point processes. Florida State University, Tallahassee, FL, 2016.
[6]
Daryl J Daley and David Vere-Jones. An introduction to the theory of point processes: volume II: general theory and structure. Springer Science & Business Media, 2007.
[7]
Hennie Daniels and Marina Velikova. Monotone and partially monotone neural networks. IEEE Transactions on Neural Networks, 21(6):906-917, 2010.
[8]
Philip J Davis and Philip Rabinowitz. Methods of numerical integration. Courier Corporation, 2007.
[9]
Nan Du, Hanjun Dai, Rakshit Trivedi, Utkarsh Upadhyay, Manuel Gomez-Rodriguez, and Le Song. Recurrent marked temporal point processes: Embedding event history to vector. In Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pages 1555-1564, 2016.
[10]
William Dunham. The calculus gallery. Princeton University Press, 2018.
[11]
Ruocheng Guo, Jundong Li, and Huan Liu. Initiator: Noise-contrastive estimation for marked temporal point process. In IJCAI, pages 2191-2197, 2018.
[12]
Hengguan Huang, Hao Wang, and Brian Mak. Recurrent poisson process unit for speech recognition. In Proceedings of the AAAI Conference on Artificial Intelligence, volume 33, pages 6538-6545, 2019.
[13]
Haibin Li, Yangtian Li, and Shangjie Li. Dual neural network method for solving multiple definite integrals. Neural Computation, 31(1):208-232, 2019.
[14]
Shuang Li, Shuai Xiao, Shixiang Zhu, Nan Du, Yao Xie, and Le Song. Learning temporal point processes via reinforcement learning. arXiv preprint arXiv:1811.05016, 2018.
[15]
David B Lindell, Julien NP Martel, and Gordon Wetzstein. Autoint: Automatic integration for fast neural volume rendering. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pages 14556-14565, 2021.
[16]
Keqin Liu. Automatic integration. arXiv e-prints, 2020.
[17]
Jerrold E. Marsden and Anthony Tromba. Vector Calculus. Macmillan, August 2003. ISBN 978-0-7167-4992-9.
[18]
Hongyuan Mei and Jason Eisner. The neural Hawkes process: A neurally self-modulating multivariate point process. arXiv preprint arXiv:1612.09328, 2016.
[19]
Yosihiko Ogata. The asymptotic behaviour of maximum likelihood estimators for stationary point processes. Annals of the Institute of Statistical Mathematics, 30(1):243-261, 1978.
[20]
Takahiro Omi, Naonori Ueda, and Kazuyuki Aihara. Fully neural network based model for general temporal point processes. arXiv preprint arXiv:1905.09690, 2019.
[21]
Alex Reinhart. A review of self-exciting spatio-temporal point processes and their applications. Statistical Science, 33(3):299-318, 2018.
[22]
Robert H Risch. The problem of integration in finite terms. Transactions of the American Mathematical Society, 139:167-189, 1969.
[23]
Robert H Risch. The solution of the problem of integration in finite terms. Bulletin of the American Mathematical Society, 76(3):605-608, 1970.
[24]
Jin Shang and Mingxuan Sun. Geometric Hawkes processes with graph convolutional recurrent neural networks. In Proceedings of the AAAI Conference on Artificial Intelligence, volume 33, pages 4878-4885, 2019.
[25]
Oleksandr Shchur, Marin Biloš, and Stephan Günnemann. Intensity-free learning of temporal point processes. arXiv preprint arXiv:1909.12127, 2019.
[26]
Utkarsh Upadhyay, Abir De, and Manuel Gomez-Rodriguez. Deep reinforcement learning of marked temporal point processes. arXiv preprint arXiv:1805.09360, 2018.
[27]
Shuai Xiao, Mehrdad Farajtabar, Xiaojing Ye, Junchi Yan, Le Song, and Hongyuan Zha. Wasserstein learning of deep generative point process models. arXiv preprint arXiv:1705.08051, 2017.
[28]
Junchi Yan, Xin Liu, Liangliang Shi, Changsheng Li, and Hongyuan Zha. Improving maximum likelihood estimation of temporal point process via discriminative and adversarial learning. In IJCAI, pages 2948-2954, 2018.
[29]
Qiang Zhang, Aldo Lipani, Omer Kirnap, and Emine Yilmaz. Self-attentive Hawkes process. In International Conference on Machine Learning, pages 11183-11193. PMLR, 2020.
[30]
Zihao Zhou and Rose Yu. Automatic integration for fast and interpretable neural point process. In Learning for Dynamics and Control Conference. PMLR, 2023.
[31]
Zihao Zhou, Xingyi Yang, Ryan Rossi, Handong Zhao, and Rose Yu. Neural point process for learning spatiotemporal event dynamics. In Learning for Dynamics and Control Conference, pages 777-789. PMLR, 2022.


Published In

NIPS '23: Proceedings of the 37th International Conference on Neural Information Processing Systems
December 2023
80772 pages

Publisher

Curran Associates Inc.

Red Hook, NY, United States

