Journal Article

基于邻域保持学习的无监督特征选择算法 (Cited by: 8)

Unsupervised Feature Selection Algorithm Based on Neighborhood Preserving Learning
Abstract: The nearest neighbor method is highly sensitive to irrelevant features, whereas neighborhood reconstruction coefficients can preserve the structure of the original data. Therefore, an unsupervised feature selection algorithm based on neighborhood preserving learning (NPL) is proposed. Firstly, a similarity matrix is constructed according to the similarity between each data sample and its neighborhood, and a low-dimensional space is built by introducing an intermediate matrix. Secondly, an effective feature subset is selected by the Laplace multiplier method. Finally, the proposed algorithm is compared with six state-of-the-art feature selection methods on four publicly available datasets. Experimental results show that the proposed algorithm effectively identifies representative features.
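The abstract only sketches the pipeline, so the short Python example below illustrates the general neighborhood-preserving idea behind this family of methods: compute LLE-style reconstruction weights from each sample's k nearest neighbors, then rank features by how well each one alone preserves that reconstruction structure. It is a rough, assumption-laden sketch for orientation only; the function name, the parameter k, the regularization constant, and the scoring rule are illustrative choices and do not reproduce the paper's similarity matrix, intermediate matrix, or multiplier-based optimization.

import numpy as np

def neighborhood_preserving_scores(X, k=5):
    # X: (n_samples, n_features) data matrix; returns one score per feature.
    # A smaller score means the feature better preserves the local
    # neighborhood reconstruction computed from the full data.
    n, d = X.shape
    sq = np.sum(X ** 2, axis=1)
    dist = sq[:, None] + sq[None, :] - 2.0 * X @ X.T        # pairwise squared distances
    np.fill_diagonal(dist, np.inf)
    W = np.zeros((n, n))
    for i in range(n):
        idx = np.argsort(dist[i])[:k]                       # k nearest neighbors of sample i
        Z = X[idx] - X[i]                                   # neighbors shifted to sample i
        G = Z @ Z.T + 1e-3 * np.trace(Z @ Z.T) * np.eye(k)  # regularized local Gram matrix
        w = np.linalg.solve(G, np.ones(k))                  # LLE-style reconstruction weights
        W[i, idx] = w / w.sum()
    M = (np.eye(n) - W).T @ (np.eye(n) - W)                 # neighborhood-preserving penalty
    scores = np.array([X[:, j] @ M @ X[:, j] / (X[:, j] @ X[:, j] + 1e-12)
                       for j in range(d)])
    return scores

# Toy usage: keep the 5 features with the smallest (best) preservation scores.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 20))
selected = np.argsort(neighborhood_preserving_scores(X, k=5))[:5]
print("selected feature indices:", selected)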
Authors: LIU Yanfang (刘艳芳); YE Dongyi (叶东毅) (College of Mathematics and Information Engineering, Longyan University, Longyan 364012; College of Mathematics and Computer Science, Fuzhou University, Fuzhou 350116)
Source: Pattern Recognition and Artificial Intelligence (《模式识别与人工智能》; EI, CSCD, Peking University Core), 2018, Issue 12, pp. 1096-1102 (7 pages)
Funding: Supported by the National Natural Science Foundation of China (No. 61502104), the Education and Scientific Research Project for Young and Middle-aged Teachers of Fujian Province (Science and Technology) (No. JAT170577), and the "Hundred Young Teachers Climbing Project" of Longyan University (No. LQ2015031, LQ2014010)
Keywords: Clustering Analysis; Neighborhood Preserving; Feature Selection; Unsupervised Learning
  • Related Literature

References: 1

Secondary References: 26

  • 1 Zhang L, Sun G, Guo J. Unsupervised Feature Selection Method Based on K-means Clustering. Application Research of Computers, 2005, 22(3): 23-24. (Cited by: 29)
  • 2 Boutemedjet S, Bouguila N, Ziou D. A Hybrid Feature Extraction Selection Approach for High-Dimensional Non-Gaussian Data Clustering. IEEE Trans on Pattern Analysis and Machine Intelligence, 2009, 31(8): 1429-1443.
  • 3 Boutsidis C, Mahoney M W, Drineas P. Unsupervised Feature Selection for Principal Components Analysis // Proc of the 14th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining. Las Vegas, USA, 2008: 61-69.
  • 4 Cai D, He X F, Han J W. Sparse Projections over Graph // Proc of the 23rd AAAI Conference on Artificial Intelligence. Chicago, USA, 2008: 610-615.
  • 5 Xu Z L, King I, Lyu M R T, et al. Discriminative Semi-Supervised Feature Selection via Manifold Regularization. IEEE Trans on Neural Networks, 2010, 21(7): 1033-1047.
  • 6 He X F, Cai D, Niyogi P. Laplacian Score for Feature Selection // Proc of the Advances in Neural Information Processing Systems 18. Vancouver, Canada, 2005: 507-514.
  • 7 Cai D, Zhang C Y, He X F. Unsupervised Feature Selection for Multi-cluster Data // Proc of the 16th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining. Washington, USA, 2010: 333-342.
  • 8 Li Z C, Yang Y, Liu J, et al. Unsupervised Feature Selection Using Nonnegative Spectral Analysis // Proc of the 26th AAAI Conference on Artificial Intelligence. Toronto, Canada, 2012: 1026-1032.
  • 9 Zhu X F, Huang Z, Yang Y, et al. Self-Taught Dimensionality Reduction on the High-Dimensional Small-Sized Data. Pattern Recognition, 2013, 46(1): 215-229.
  • 10 He X F, Niyogi P. Locality Preserving Projections // Proc of the Advances in Neural Information Processing Systems 16. Vancouver, Canada, 2003: 153-160.

Co-cited Literature: 7

Co-citation Literature: 48

Citing Literature: 8

Secondary Citing Literature: 28
