Nov 7, 2018 · A low-rank constraint is used to learn a suitable dictionary that preserves the subspace structures of data samples. Meanwhile, the sparse constraint ...
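As a hedged illustration only (a generic objective of this kind, not necessarily the exact model of that paper), coupling a low-rank constraint with a sparse constraint in dictionary learning typically amounts to

\min_{D,\,Z} \; \|X - DZ\|_F^2 + \lambda \|Z\|_* + \beta \|Z\|_1

where X is the data matrix, D the learned dictionary, Z the coefficient matrix, the nuclear norm \|Z\|_* promotes low rank (preserving subspace structure), the \ell_1 norm enforces sparsity, and \lambda, \beta are assumed trade-off parameters.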
This paper introduces a novel and effective unsupervised feature selection method via multiple graph fusion and feature weight learning (MGF^2WL) to address ...
In this paper, we propose a low-rank structure preserving method for unsupervised feature selection (LRPFS) to address this shortcoming. The data matrix ...
In this paper, we propose an efficient unsupervised feature selection algorithm, which incorporates low-rank approximation as well as structure learning. First, ...
Jun 21, 2021 · Dictionary learning in a low-rank representation not only provides a new representation but also maintains feature correlation.
Jun 16, 2022 · In this paper, we introduce a novel unsupervised feature selection approach by applying dictionary learning ideas in a low-rank representation.
May 23, 2023 · A sparse regression model based on latent low-rank representation with the symmetric constraint for unsupervised feature selection is proposed.
In this paper, we present a general multi-view unsupervised feature selection model which integrates common graph learning and feature selection into a ...
Aug 11, 2023 · In this paper, we propose a sparse low-rank approximation of matrix and local preserving model for unsupervised image feature selection.
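None of the snippets above gives enough detail to reproduce its algorithm. As a minimal, self-contained illustration of the shared idea, the sketch below scores features by their loadings on a rank-k truncated SVD of the data. This is a classical leverage-score baseline, not the method of any paper listed here; the function name, parameters, and toy data are assumptions for illustration only.

import numpy as np

def leverage_score_selection(X, rank, n_features):
    # Center each feature so the SVD reflects covariance structure.
    Xc = X - X.mean(axis=0)
    # Thin SVD of the centered data; rows of Vt are the right singular
    # vectors over the feature axis.
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    # Leverage score of feature j: squared norm of its loadings on the
    # top-`rank` singular directions, i.e. how strongly the feature
    # participates in the dominant low-rank structure.
    scores = (Vt[:rank, :] ** 2).sum(axis=0)
    # Keep the highest-scoring features.
    return np.argsort(scores)[::-1][:n_features]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 30))   # toy data: 200 samples, 30 features
    print(leverage_score_selection(X, rank=5, n_features=10))

The methods in the results above go further (learning dictionaries, graphs, or regression weights jointly with the low-rank term), but this baseline shows the core mechanism of using a low-rank approximation to rank features.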