Dimensionality reduction via kernel sparse representation (Cited by: 3)
Authors: Zhisong Pan, Zhantao Deng, Yibing Wang, Yanyan Zhang. Frontiers of Computer Science (SCIE, EI, CSCD), 2014, No. 5, pp. 807-815 (9 pages)
Dimensionality reduction (DR) methods based on sparse representation, one of the hottest research topics in recent years, have achieved remarkable performance in many applications. However, existing sparse-representation-based methods struggle with nonlinear problems because they seek sparse representations of the data only in the original space. Motivated by kernel tricks, we propose a new framework called empirical kernel sparse representation (EKSR) to solve nonlinear problems. In this framework, nonlinearly separable data are mapped into a kernel space in which the nonlinear similarity can be captured, and the data in the kernel space are then reconstructed by sparse representation to preserve the sparse structure, which is obtained by minimizing an ℓ1 regularization-related objective function. EKSR provides new insights into dimensionality reduction and extends two models: 1) empirical kernel sparsity preserving projection (EKSPP), a feature extraction method based on sparsity preserving projection (SPP); and 2) empirical kernel sparsity score (EKSS), a feature selection method based on sparsity score (SS). Both methods can choose neighborhoods automatically owing to the natural discriminative power of sparse representation. Compared with several existing approaches, the proposed framework reduces computational complexity and is more convenient in practice.
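The abstract describes two steps: embed the data in an empirical kernel space where nonlinear similarity is captured, then reconstruct each embedded sample from the others by ℓ1-regularized sparse coding. The sketch below is only an illustration of that idea under assumptions not stated in the abstract: an RBF kernel, an eigendecomposition-based empirical kernel map, and scikit-learn's Lasso as the ℓ1 solver; the function names (`empirical_kernel_map`, `sparse_codes`) and parameter values are hypothetical, not the authors' formulation.

```python
# Sketch of empirical kernel sparse representation (EKSR-style), per the abstract.
# Assumptions: RBF kernel, empirical kernel map Phi = Lambda^{-1/2} U^T K, Lasso solver.
import numpy as np
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.linear_model import Lasso

def empirical_kernel_map(X, gamma=1.0):
    """Map samples into the empirical kernel space via K = U Lambda U^T,
    returning coordinates Phi = Lambda^{-1/2} U^T K (one column per sample)."""
    K = rbf_kernel(X, X, gamma=gamma)            # n x n kernel matrix
    eigvals, eigvecs = np.linalg.eigh(K)
    keep = eigvals > 1e-10                       # drop numerically zero directions
    scale = np.diag(1.0 / np.sqrt(eigvals[keep]))
    return scale @ eigvecs[:, keep].T @ K        # r x n embedded data

def sparse_codes(Phi, alpha=0.01):
    """Reconstruct each embedded sample from the others with an
    l1-regularized least-squares fit (Lasso); the self-weight stays zero."""
    n = Phi.shape[1]
    S = np.zeros((n, n))
    for i in range(n):
        others = np.delete(np.arange(n), i)
        lasso = Lasso(alpha=alpha, max_iter=5000)
        lasso.fit(Phi[:, others], Phi[:, i])
        S[others, i] = lasso.coef_
    return S                                     # column i: sparse code of sample i

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(40, 5))                 # toy data
    Phi = empirical_kernel_map(X, gamma=0.5)
    S = sparse_codes(Phi, alpha=0.05)
    print("average nonzeros per sample:", (np.abs(S) > 1e-6).sum(axis=0).mean())
```

The sparse coefficient matrix S plays the role of an automatically chosen neighborhood graph: EKSPP-like feature extraction would seek a projection preserving these coefficients, while an EKSS-like feature selection score would rank features by how well they respect the same sparse structure.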
Keywords: feature extraction, feature selection, sparse representation, kernel trick