Inspired by the success of the iterative magnitude pruning technique in finding lottery tickets of neural networks, we propose a new method—Sparser Random Feature Models via IMP (SHRIMP)—to efficiently fit high-dimensional data with inherent low-dimensional structure in the form of sparse variable dependencies.
Our method can be viewed as a combined process to construct and find sparse lottery tickets for two-layer dense networks. We explain the observed benefit of ...
SHRIMP is shown to obtain test accuracy better than or competitive with state-of-the-art sparse feature and additive methods such as SRFE-S, SSAM, ...
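The construct-and-prune process described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the cosine feature map, ridge parameter, pruning fraction, and round count are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_features(X, W, b):
    # Random Fourier-style features: a "hidden layer" with fixed random weights.
    return np.cos(X @ W + b)

def ridge_fit(Phi, y, lam=1e-3):
    # Solve (Phi^T Phi + lam*I) c = Phi^T y for the output-layer weights.
    d = Phi.shape[1]
    return np.linalg.solve(Phi.T @ Phi + lam * np.eye(d), Phi.T @ y)

def imp_random_features(X, y, n_features=256, n_rounds=5, keep_frac=0.5):
    # Iterative magnitude pruning on a random feature model (sketch):
    # fit output weights, keep the largest-magnitude ones, refit, repeat.
    p = X.shape[1]
    W = rng.normal(size=(p, n_features))
    b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)
    active = np.arange(n_features)  # indices of surviving features
    for _ in range(n_rounds):
        Phi = random_features(X, W[:, active], b[active])
        c = ridge_fit(Phi, y)
        k = max(1, int(keep_frac * len(active)))
        top = np.argsort(np.abs(c))[-k:]  # prune smallest-magnitude weights
        active = active[np.sort(top)]
    Phi = random_features(X, W[:, active], b[active])
    return W[:, active], b[active], ridge_fit(Phi, y)

# Toy target depending on only 2 of 10 inputs (a sparse variable dependency).
X = rng.normal(size=(200, 10))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1]
W, b, c = imp_random_features(X, y)
pred = random_features(X, W, b) @ c
```

With `keep_frac=0.5` and 5 rounds, the 256 initial features shrink to 8 surviving ones, so the final predictor is a much sparser two-layer model refit on the retained features.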
Dec 7, 2021 · A detailed comparison of SHRIMP with sparse random feature models (Hashemi et al., 2021; Elesedy et al., 2020) and shrunk additive models ( ...
This repository is the official implementation of SHRIMP: Sparser Random Feature Models via Iterative Magnitude Pruning. Installation. Please run pip install -e ...
Yuege Xie, Robert Shi. SHRIMP: Sparser Random Feature Models via Iterative Magnitude Pruning. Proceedings of Mathematical and Scientific Machine Learning.