

A feature selection algorithm based on kernel sparse representation
Abstract: To address the "curse of dimensionality" that arises when classifying high-dimensional data, this paper proposes a new feature selection algorithm that combines kernel functions with sparse learning. Specifically, each feature is first mapped into a kernel space via a kernel function, and linear feature selection is performed in this high-dimensional kernel space, thereby achieving nonlinear feature selection in the original low-dimensional space. Second, the features mapped into the kernel space are sparsely reconstructed, yielding a sparse representation of the original dataset. Next, an L1-norm based feature scoring mechanism is constructed to select the optimal feature subset. Finally, the data remaining after feature selection are used in classification experiments. Experimental results on public datasets show that the proposed algorithm performs feature selection effectively and improves classification accuracy by about 3% over the comparison algorithms.
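The abstract only outlines the pipeline (per-feature kernel mapping, sparse reconstruction, L1-norm scoring) without giving the optimization details. As a rough illustrative sketch of that idea — not the authors' exact algorithm — each feature column can be mapped with an RBF kernel and scored by the L1 norm of Lasso coefficients fitted in that kernel space; the kernel choice, `gamma`, and `alpha` here are assumptions:

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.linear_model import Lasso
from sklearn.metrics.pairwise import rbf_kernel

def kernel_sparse_feature_scores(X, y, gamma=1.0, alpha=0.01):
    """Score each feature via L1-regularized reconstruction in kernel space.

    For every feature j, the single column X[:, j] is mapped into an RBF
    kernel space; a Lasso model then regresses y on that kernel matrix.
    The feature's score is the L1 norm of the resulting coefficients, so
    features whose nonlinear (kernelized) representation reconstructs the
    target with large sparse weights score higher.
    """
    n_samples, n_features = X.shape
    scores = np.zeros(n_features)
    for j in range(n_features):
        col = X[:, j:j + 1]                      # one feature, shape (n, 1)
        K = rbf_kernel(col, col, gamma=gamma)    # nonlinear map of feature j
        lasso = Lasso(alpha=alpha, max_iter=10000)
        lasso.fit(K, y)                          # sparse reconstruction of y
        scores[j] = np.abs(lasso.coef_).sum()    # L1-norm based score
    return scores

X, y = load_iris(return_X_y=True)
scores = kernel_sparse_feature_scores(X, y)
top2 = np.argsort(scores)[::-1][:2]              # keep the 2 best features
print("feature scores:", np.round(scores, 3))
print("selected feature indices:", sorted(top2.tolist()))
```

The selected columns of `X` would then feed a downstream classifier, as in the paper's experiments; the paper's actual method may instead solve a joint sparse objective over all features rather than scoring them independently.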
Authors: LÜ Zhi-zheng; LI Yang-ding; LEI Cong (College of Computer Science and Information Engineering, Guangxi Normal University, Guilin 541004, China)
Source: Computer Engineering & Science (《计算机工程与科学》, CSCD, Peking University core journal), 2020, No. 1, pp. 167-177 (11 pages)
Funding: National Key R&D Program of China (2016YFB1000905); National Natural Science Foundation of China (6117013120); National 973 Program (2013CB329404); China Postdoctoral Science Foundation (2015M570837); Guangxi Natural Science Foundation (2015GXNSFCB139011)
Keywords: feature selection; nonlinear; kernel function; sparse learning; L1-norm

