Improving Robustness of Hyperbolic Neural Networks by Lipschitz Analysis
Abstract
Publisher: Association for Computing Machinery, New York, NY, United States