
A Semi-Supervised Feature Selection Framework Based on Laplacian
Abstract: The LASSO algorithm and its extensions ignore the association information among sample data, and the labeled samples required for fully supervised learning are difficult to obtain in practice. To address this, a semi-supervised feature selection framework based on Laplacian (SLFS) is proposed. A LASSO sparsity term is introduced to remove redundant features and select effective ones; a Laplacian regularization term is introduced to preserve the intrinsic geometric distribution of labeled and unlabeled samples of the same class, helping the model select a more discriminative feature set; and the semi-supervised feature selection model is reconstructed through a similarity matrix. Classification experiments on UCI datasets show that the proposed method effectively improves classification performance, and that the geometric distribution information of samples should not be neglected.
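The abstract describes an objective that combines a least-squares fit on the labeled samples, an L1 (LASSO) sparsity term, and a Laplacian regularizer built from a similarity matrix over both labeled and unlabeled samples. The paper's exact formulation and solver are not given in this record, so the following is only a minimal sketch under assumed conventions (Gaussian-kernel similarity, unnormalized Laplacian, ISTA-style proximal gradient); the names `graph_laplacian` and `slfs_sketch` and all parameter values are illustrative, not the authors'.

```python
import numpy as np

def graph_laplacian(X, sigma=1.0):
    """Build a Gaussian-kernel similarity matrix S over all samples and
    return the unnormalized graph Laplacian L = D - S."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)   # pairwise squared distances
    S = np.exp(-d2 / (2 * sigma ** 2))
    np.fill_diagonal(S, 0.0)                              # no self-similarity
    return np.diag(S.sum(axis=1)) - S

def slfs_sketch(X, Y, alpha=0.1, beta=0.1, lr=1e-3, iters=500):
    """ISTA-style sketch of a semi-supervised objective of the assumed form
        min_W ||M (X W - Y)||_F^2 + beta * tr(W^T X^T L X W) + alpha * ||W||_1,
    where M masks out unlabeled rows (their rows in Y are all-zero).
    Returns the weight matrix W; feature scores are its row norms."""
    d, c = X.shape[1], Y.shape[1]
    labeled = (np.abs(Y).sum(axis=1) > 0).astype(float)[:, None]  # 1 for labeled rows
    L = graph_laplacian(X)                                # uses labeled AND unlabeled samples
    W = np.zeros((d, c))
    for _ in range(iters):
        # gradient of the two smooth terms (fit on labeled rows + Laplacian smoothness)
        grad = 2 * X.T @ (labeled * (X @ W - Y)) + 2 * beta * X.T @ L @ X @ W
        W = W - lr * grad
        # proximal step for the L1 term: soft-thresholding induces sparsity
        W = np.sign(W) * np.maximum(np.abs(W) - lr * alpha, 0.0)
    return W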
Authors: WU Jinhua; WAN Jiashan; WU Xiang; HUO Qinghua (College of Computer and Software Engineering, Anhui Institute of Information Technology, Wuhu, Anhui 241000, China)
Source: Journal of Chongqing University of Science and Technology (Natural Sciences Edition), CAS, 2019, No. 1, pp. 85-89 (5 pages)
Funding: Anhui Provincial Key Natural Science Research Project for Universities, "Research on Feature Selection Methods Based on Sparse Theory and Regularization Terms" (KJ2017A799); Anhui Provincial Key Natural Science Research Project for Universities, "Analysis and Research on Moving-Object Detection and Tracking in Multiple Scenes Based on the CamShift Method" (KJ2018A0634); Anhui Provincial Major Natural Science Research Project for Universities, "Research and Implementation of Web Knowledge Push Technology Based on Interaction Relations and Deep Learning" (KJ2017ZD53)
Keywords: learning algorithm; feature selection; unlabeled samples; LASSO algorithm; regularization term; semi-supervised learning
