
Low-rank feature selection algorithm based on sparse learning (Cited by: 2)
Abstract The traditional regression model does not guarantee good performance because it conducts feature selection without considering the correlations between class labels. To address this issue, this paper proposes a novel robust low-rank feature selection method. Specifically, within a linear regression framework, a low-rank constraint captures the correlations between class labels, while an l_(2,p)-norm regularization term from sparse learning theory models the correlation structure among features, removing the influence of irrelevant and redundant features. The method also embeds a subspace learning method, Linear Discriminant Analysis (LDA), into the feature selection model to adjust the selection result; iterating between feature selection and LDA yields optimal features once the algorithm converges. Experimental results on six public datasets show that the proposed method outperforms four comparison methods.
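The abstract describes an optimization combining a least-squares regression loss, a low-rank constraint on the coefficient matrix, and an l_(2,p)-norm sparsity penalty. As a rough illustration only, not the authors' algorithm (it fixes p = 1, omits the embedded LDA step, and handles the rank constraint by SVD truncation), a minimal sketch in Python/NumPy might look like:

```python
import numpy as np

def low_rank_l21_feature_selection(X, Y, rank=2, lam=0.5, n_iter=50, eps=1e-8):
    """Sketch: minimize ||XW - Y||_F^2 + lam * ||W||_{2,1}, rank(W) <= rank.

    The l2,1 term is handled by iteratively reweighted least squares (IRLS);
    the rank constraint by truncated SVD. Returns per-feature scores
    (row l2-norms of W); features with small scores can be discarded.
    """
    W = np.linalg.lstsq(X, Y, rcond=None)[0]      # least-squares init
    for _ in range(n_iter):
        # IRLS reweighting: D_ii = 1 / (2 * ||w_i||_2), eps avoids div-by-zero
        row_norms = np.linalg.norm(W, axis=1)
        D = np.diag(1.0 / (2.0 * row_norms + eps))
        # Closed-form update of the reweighted ridge-like subproblem
        W = np.linalg.solve(X.T @ X + lam * D, X.T @ Y)
        # Enforce the low-rank constraint by truncated SVD
        U, s, Vt = np.linalg.svd(W, full_matrices=False)
        s[rank:] = 0.0
        W = U @ np.diag(s) @ Vt
    return np.linalg.norm(W, axis=1)

# Toy usage: 3 informative features out of 6 (hypothetical data)
rng = np.random.default_rng(1)
X = rng.standard_normal((100, 6))
true_W = np.zeros((6, 2))
true_W[:3] = rng.standard_normal((3, 2))
Y = X @ true_W + 0.01 * rng.standard_normal((100, 2))
scores = low_rank_l21_feature_selection(X, Y, rank=2, lam=0.5)
```

Features whose rows of W retain large l2-norms after convergence are kept; in the paper the selection result is further adjusted by alternating with LDA until the algorithm converges.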
Authors HU Rongyao; LIU Xingyi; CHENG Debo; HE Wei (Guangxi Key Lab of Multi-source Information Mining & Security, Guangxi Normal University, Guilin, Guangxi 541004, China; Guangxi Collaborative Innovation Center of Multi-source Information Integration and Intelligent Processing, Guilin, Guangxi 541004, China; Qinzhou University, Qinzhou, Guangxi 535000, China)
Source Computer Engineering and Applications (《计算机工程与应用》, CSCD, Peking University Core Journal), 2017, No. 10, pp. 132-138 (7 pages)
Funding National Natural Science Foundation of China (No.61170131, No.61450001, No.61263035, No.61363009, No.61573270); Guangxi Natural Science Foundation (No.2012GXNSFGA060004, No.2015GXNSFCB139011); Guangxi Graduate Education Innovation Program (No.YCSZ2016046, No.XYCSZ2017064)
Keywords linear regression; linear discriminant analysis; feature selection; subspace learning; sparse learning