Manifold Learning Algorithms of Orthogonal Discriminant Based on Random Projection (cited by 3)
Abstract: A locally linear embedding algorithm based on manifold distance (MDLLE) is proposed. Similarity between data points is measured by the manifold distance, which is used to select each sample point's neighborhood; this resolves the sensitivity to the neighborhood parameter that arises when Euclidean distance is used as the similarity measure. The maximum margin criterion (MMC) is introduced into MDLLE to construct an optimal translation-and-scaling model, so that the algorithm preserves the local geometric structure of LLE while gaining the discriminant ability of MMC. Orthogonalizing the low-dimensional feature vectors eliminates noise introduced during dimensionality reduction and further improves the algorithm's supervised discriminant ability. Experimental results show that the proposed method achieves good dimensionality reduction and effectively avoids the sensitivity of local dimensionality-reduction algorithms to the neighborhood parameter. Random projection is independent of the original high-dimensional data: it maps the high-dimensional data into a low-dimensional space through a row-normalized random transformation matrix while maintaining a close relationship between the mapping and the original data. Theoretical analysis proves that using random projection in a manifold learning algorithm preserves the information of the high-dimensional data in the low-dimensional space with high probability.
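The random-projection step described in the abstract can be illustrated with a minimal sketch (this is not the paper's code; the dimensions and the Gaussian random matrix are illustrative assumptions): points are mapped through a row-normalized random matrix, and pairwise distances in the low-dimensional space stay close to the originals with high probability, in the Johnson-Lindenstrauss style the abstract appeals to.

```python
import numpy as np

# Sketch of projecting high-dimensional data through a row-normalized
# random transformation matrix, as described in the abstract.
rng = np.random.default_rng(0)

n, d_high, d_low = 50, 1000, 100          # illustrative sizes (assumption)
X = rng.normal(size=(n, d_high))          # synthetic high-dimensional data

R = rng.normal(size=(d_low, d_high))      # Gaussian random matrix
R /= np.linalg.norm(R, axis=1, keepdims=True)  # normalize each row to unit length

# Rescale so expected squared norms are preserved:
# E[|R x|^2] = (d_low / d_high) |x|^2 for unit-norm random rows.
Y = X @ R.T * np.sqrt(d_high / d_low)

# Compare a few pairwise distances before and after projection;
# the ratios should concentrate near 1.
for i, j in [(0, 1), (2, 3), (4, 5)]:
    d_orig = np.linalg.norm(X[i] - X[j])
    d_proj = np.linalg.norm(Y[i] - Y[j])
    print(f"pair ({i},{j}): distance ratio = {d_proj / d_orig:.3f}")
```

Because the matrix R is drawn independently of X, the same projection can be applied before running a neighborhood-based method such as LLE, which is the setting the paper's high-probability guarantee concerns.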
Source: Journal of Zhengzhou University (Natural Science Edition), CAS, Peking University Core Journal, 2016, No. 1, pp. 102-109, 115 (9 pages)
Funding: Gansu Province Science and Technology Support Program (1204GKCA038); Gansu Provincial Department of Finance Basic Scientific Research Fund (213063)
Keywords: manifold learning; neighborhood selection; manifold distance; orthogonal discriminant; similarity measure; locally linear embedding; random projection