Tightening the Approximation Error of Adversarial Risk with Auto Loss Function Search
Abstract
Published In
Publisher: Association for Computing Machinery, New York, NY, United States
Qualifiers
- Research-article