DOI: 10.1145/3447548.3467422

Discrete-time Temporal Network Embedding via Implicit Hierarchical Learning in Hyperbolic Space

Published: 14 August 2021

Abstract

Representation learning over temporal networks has drawn considerable attention in recent years. Efforts have mainly focused on modeling structural dependencies and temporal evolving regularities in Euclidean space, which, however, underestimates the inherently complex and hierarchical properties of many real-world temporal networks, leading to sub-optimal embeddings. To explore these properties of a complex temporal network, we propose a hyperbolic temporal graph network (HTGN) that fully exploits the exponential capacity and hierarchical awareness of hyperbolic geometry. More specifically, HTGN maps the temporal graph into hyperbolic space and incorporates a hyperbolic graph neural network and a hyperbolic gated recurrent neural network to capture evolving behaviors while implicitly preserving hierarchical information. Furthermore, in hyperbolic space, we propose two important modules that enable HTGN to model temporal networks successfully: (1) a hyperbolic temporal contextual self-attention (HTA) module that attends to historical states, and (2) a hyperbolic temporal consistency (HTC) module that ensures stability and generalization. Experimental results on multiple real-world datasets demonstrate the superiority of HTGN for temporal graph embedding: it consistently outperforms competing methods by significant margins on various temporal link prediction tasks, achieving AUC improvements of up to 9.98% for link prediction and 11.4% for new link prediction. Moreover, an ablation study further validates the representational ability of hyperbolic geometry and the effectiveness of the proposed HTA and HTC modules.

Supplementary Material

MP4 File (discretetime_temporal_network_embedding_via-menglin_yang-min_zhou-38957996-bg5Q.mp4)
HTGN: a hyperbolic temporal graph network for temporal network/graph embedding. HTGN leverages hyperbolic graph neural networks to capture the underlying hierarchical structure and hyperbolic gated recurrent neural networks to model evolving patterns in hyperbolic space. Two key components enable the framework to model temporal graphs successfully: hyperbolic temporal contextual self-attention (HTA), which extracts attentive historical states, and hyperbolic temporal consistency (HTC), which ensures stability and generalization.
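The summary says HTC "ensures stability and generalization" between time steps without giving the formula here. One hedged, illustrative reading (an assumption, not the paper's definition) is a regularizer that penalizes the hyperbolic distance between consecutive hidden states, so embeddings drift smoothly over time. A minimal sketch using the standard Poincaré-ball geodesic distance:

```python
import numpy as np

def poincare_dist(x, y, eps=1e-9):
    # Geodesic distance in the unit Poincaré ball:
    #   d(x, y) = arccosh(1 + 2||x - y||^2 / ((1 - ||x||^2)(1 - ||y||^2)))
    sq = np.sum((x - y) ** 2, axis=-1)
    denom = (1.0 - np.sum(x ** 2, axis=-1)) * (1.0 - np.sum(y ** 2, axis=-1))
    return np.arccosh(1.0 + 2.0 * sq / np.maximum(denom, eps))

def consistency_penalty(states):
    # Hypothetical temporal-consistency regularizer: mean hyperbolic
    # distance between hidden states at consecutive time steps.
    states = np.asarray(states, dtype=float)
    return float(np.mean(poincare_dist(states[:-1], states[1:])))

# Three consecutive hidden states that move slightly inside the ball.
h = np.array([[0.0, 0.0], [0.1, 0.0], [0.1, 0.05]])
penalty = consistency_penalty(h)  # small positive value for small drift
```

Minimizing such a term alongside the link prediction loss would discourage abrupt jumps of node embeddings between snapshots, which matches the stated goal of stability; the actual HTC module is defined in the paper itself.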



Published In

KDD '21: Proceedings of the 27th ACM SIGKDD Conference on Knowledge Discovery & Data Mining
August 2021
4259 pages
ISBN:9781450383325
DOI:10.1145/3447548
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. graph neural network
  2. hyperbolic space
  3. representation learning
  4. temporal network

Qualifiers

  • Research-article

Conference

KDD '21

Acceptance Rates

Overall Acceptance Rate 1,133 of 8,635 submissions, 13%

Article Metrics

  • Downloads (Last 12 months)226
  • Downloads (Last 6 weeks)26
Reflects downloads up to 03 Oct 2024

Cited By
  • (2024) Predicting Long-term Dynamics of Complex Networks via Identifying Skeleton in Hyperbolic Space. In Proceedings of the 30th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, 1655-1666. DOI: 10.1145/3637528.3671968. Online publication date: 25-Aug-2024.
  • (2024) Representation Learning of Temporal Graphs with Structural Roles. In Proceedings of the 30th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, 654-665. DOI: 10.1145/3637528.3671854. Online publication date: 25-Aug-2024.
  • (2024) Topology-monitorable Contrastive Learning on Dynamic Graphs. In Proceedings of the 30th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, 4700-4711. DOI: 10.1145/3637528.3671777. Online publication date: 25-Aug-2024.
  • (2024) LLM4DyG: Can Large Language Models Solve Spatial-Temporal Problems on Dynamic Graphs? In Proceedings of the 30th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, 4350-4361. DOI: 10.1145/3637528.3671709. Online publication date: 25-Aug-2024.
  • (2024) SEFraud: Graph-based Self-Explainable Fraud Detection via Interpretative Mask Learning. In Proceedings of the 30th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, 5329-5338. DOI: 10.1145/3637528.3671534. Online publication date: 25-Aug-2024.
  • (2024) VGGM: Variational Graph Gaussian Mixture Model for Unsupervised Change Point Detection in Dynamic Networks. IEEE Transactions on Information Forensics and Security, Vol. 19, 4272-4284. DOI: 10.1109/TIFS.2024.3377548.
  • (2024) Simplex Pattern Prediction Based on Dynamic Higher Order Path Convolutional Networks. IEEE Transactions on Computational Social Systems, Vol. 11, 5, 6623-6636. DOI: 10.1109/TCSS.2024.3408214. Online publication date: Oct-2024.
  • (2024) Leveraging Hyperbolic Dynamic Neural Networks for Knowledge-Aware Recommendation. IEEE Transactions on Computational Social Systems, Vol. 11, 3, 4396-4411. DOI: 10.1109/TCSS.2024.3353467. Online publication date: Jun-2024.
  • (2024) TimeSGN: Scalable and Effective Temporal Graph Neural Network. In 2024 IEEE 40th International Conference on Data Engineering (ICDE), 3297-3310. DOI: 10.1109/ICDE60146.2024.00255. Online publication date: 13-May-2024.
  • (2024) Dynamic Graph Embedding via Self-Attention in the Lorentz Space. In 2024 27th International Conference on Computer Supported Cooperative Work in Design (CSCWD), 199-204. DOI: 10.1109/CSCWD61410.2024.10580502. Online publication date: 8-May-2024.
