DOI: 10.1145/3570991.3571002
Short paper

Event Uncertainty using Ensemble Neural Hawkes Process

Published: 04 January 2023

Abstract

Various real-world applications in science and industry are recorded over time as asynchronous event sequences, which comprise the occurrence times of events. Applications involving such sequences include crime analysis, earthquake prediction, neural spike-train analysis, and infectious disease forecasting. A principled framework for modeling asynchronous event sequences is the temporal point process. Recent work on neural temporal point processes has combined the theoretical foundations of point processes with the universal approximation ability of neural networks. However, the predictions made by these models carry uncertainty arising from imperfect model inference, so it is highly desirable to associate uncertainty with the predictions as well. In this paper, we propose a novel model, the Ensemble Neural Hawkes Process, which predicts event occurrence times along with the associated uncertainty, thereby improving generalization. We also propose an evaluation metric that captures the uncertainty-modeling capability for event prediction. The efficacy of the proposed model is demonstrated on various simulated and real-world datasets.
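The ensemble idea described in the abstract can be illustrated with a minimal, hypothetical sketch (not the paper's model): several independently fitted members each predict the next inter-event time, and the spread across members serves as an uncertainty estimate. Here a bootstrap ensemble of simple exponential-rate estimators stands in for the neural Hawkes members; `simulate_gaps`, `fit_member`, and `ensemble_predict` are illustrative names, not from the paper.

```python
import random
import statistics

def simulate_gaps(n, rate, seed):
    # Simulate i.i.d. exponential inter-event times as a toy
    # stand-in for an observed asynchronous event sequence.
    rng = random.Random(seed)
    return [rng.expovariate(rate) for _ in range(n)]

def fit_member(gaps, rng):
    # One ensemble member: bootstrap-resample the data, then take
    # the exponential MLE of the rate (1 / mean inter-event time).
    sample = [rng.choice(gaps) for _ in gaps]
    return 1.0 / (sum(sample) / len(sample))

def ensemble_predict(gaps, n_members=10, seed=0):
    # Each member predicts the expected next inter-event time 1/rate;
    # the ensemble mean is the point prediction and the standard
    # deviation across members quantifies predictive uncertainty.
    rng = random.Random(seed)
    preds = [1.0 / fit_member(gaps, rng) for _ in range(n_members)]
    return statistics.mean(preds), statistics.stdev(preds)

gaps = simulate_gaps(200, rate=2.0, seed=42)
mean_gap, uncertainty = ensemble_predict(gaps)
```

In the paper's setting each member would instead be a neural temporal point process trained from a different initialization, but the aggregation pattern (mean prediction plus cross-member spread) is the same deep-ensemble recipe.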


Published In

CODS-COMAD '23: Proceedings of the 6th Joint International Conference on Data Science & Management of Data (10th ACM IKDD CODS and 28th COMAD)
January 2023
357 pages
ISBN:9781450397971
DOI:10.1145/3570991

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. Event modeling
  2. Hawkes process
  3. Time-to-event prediction

Qualifiers

  • Short-paper
  • Research
  • Refereed limited

Conference

CODS-COMAD 2023

Acceptance Rates

Overall Acceptance Rate 197 of 680 submissions, 29%


