DOI: 10.1145/3637528.3671875
Research article · Open access

Improving Robustness of Hyperbolic Neural Networks by Lipschitz Analysis

Published: 24 August 2024

Abstract

Hyperbolic neural networks (HNNs) are emerging as a promising tool for representing data embedded in non-Euclidean geometries, yet their adoption has been hindered by challenges related to stability and robustness. In this work, we conduct a rigorous Lipschitz analysis of HNNs and propose Lipschitz regularization as a novel strategy for enhancing their robustness. Our investigation spans both the Poincaré ball model and the hyperboloid model, establishing Lipschitz bounds for HNN layers. Importantly, our analysis provides detailed insight into how these bounds behave as a function of feature norms, distinguishing in particular between features with unit norms and features with large norms. We further study regularization based on the derived Lipschitz bounds. Our empirical validation demonstrates consistent improvements in HNN robustness against noisy perturbations.
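As a point of reference for the analysis described above, the display below is a minimal sketch of the standard notion at stake: the Lipschitz constant of a map f on the Poincaré ball (with geodesic distance d_B), together with a generic Lipschitz-regularized training objective. The notation (f, d_B, λ, L̂) is illustrative; the paper's own bounds for specific HNN layers are not reproduced here.

```latex
% Sketch: Lipschitz constant of a map f on the Poincare ball
% (\mathbb{B}^n, d_{\mathbb{B}}), and a generic regularized objective.
\[
  \operatorname{Lip}(f)
    \;=\; \sup_{x \neq y \in \mathbb{B}^n}
      \frac{d_{\mathbb{B}}\bigl(f(x),\, f(y)\bigr)}{d_{\mathbb{B}}(x,\, y)},
  \qquad
  \min_{\theta}\; \mathcal{L}(\theta) \;+\; \lambda\, \widehat{L}(\theta),
\]
% where \mathcal{L}(\theta) is the task loss, \widehat{L}(\theta) is an
% upper bound on the Lipschitz constants of the network's layers (the
% quantity the paper derives for HNN layers), and \lambda > 0 is a
% regularization weight.
```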

Supplemental Material

MP4 File - Promotional video for rtp1256
This promotional video gives a brief introduction to the paper, Improving Robustness of Hyperbolic Neural Networks by Lipschitz Analysis, co-authored by Yuekang Li, Yidan Mao, Yifei Yang, and Dongmian Zou. Motivated by the vulnerability of HNNs to noise perturbations, their contributions lie in deriving Lipschitz bounds for HNN layers, in both the Poincaré ball and hyperboloid models, in explicit and implementable form; investigating the behavior of the derived bounds with respect to input features under various conditions; deriving simplified expressions for regularizing HNNs with normalized input features; and validating the effectiveness of Lipschitz regularization in enhancing the robustness of HNNs against noise.
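To make the regularization strategy concrete, here is a minimal, hypothetical PyTorch sketch of one training step with a Lipschitz penalty added to the loss. It uses a plain Euclidean MLP and the spectral norm of each linear layer as the Lipschitz surrogate; the paper's actual penalty uses the bounds it derives for hyperbolic layers, which are not reproduced here.

```python
import torch
import torch.nn as nn

def lipschitz_penalty(model: nn.Module) -> torch.Tensor:
    """Sum of spectral norms over all linear layers.

    The spectral norm upper-bounds the Lipschitz constant of a Euclidean
    linear layer; it stands in here for the hyperbolic-layer bounds
    derived in the paper.
    """
    penalty = torch.zeros(())
    for m in model.modules():
        if isinstance(m, nn.Linear):
            penalty = penalty + torch.linalg.matrix_norm(m.weight, ord=2)
    return penalty

# Toy model, data, and hyperparameters (all hypothetical).
model = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 4))
criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
lam = 1e-3  # regularization weight

x = torch.randn(8, 16)         # batch of features
y = torch.randint(0, 4, (8,))  # class labels

# One training step: task loss plus the Lipschitz penalty.
loss = criterion(model(x), y) + lam * lipschitz_penalty(model)
optimizer.zero_grad()
loss.backward()
optimizer.step()
```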

Published In

KDD '24: Proceedings of the 30th ACM SIGKDD Conference on Knowledge Discovery and Data Mining
August 2024, 6901 pages
ISBN: 9798400704901
DOI: 10.1145/3637528
This work is licensed under a Creative Commons Attribution 4.0 International License.

Publisher

Association for Computing Machinery

New York, NY, United States

Author Tags

  1. hyperbolic neural networks
  2. Lipschitz bounds
  3. noisy data
  4. robustness

Conference

KDD '24

Acceptance Rates

Overall acceptance rate: 1,133 of 8,635 submissions (13%)

Article Metrics

  • Total citations: 0
  • Total downloads: 322
  • Downloads (last 12 months): 322
  • Downloads (last 6 weeks): 78

Reflects downloads up to 15 January 2025.
