Improved Local Tangent Space Alignment Algorithm (cited by: 2)
Abstract: Local tangent space alignment (LTSA) is an effective manifold learning algorithm that can recover low-dimensional embedding coordinates from high-dimensional data. The tangent space at each data point plays a central role in LTSA: the local geometric features of a neighborhood are mostly represented in the tangent space of its sample point. In practice, however, LTSA takes the space spanned by the principal components of the sample covariance matrix of the neighborhood as the tangent space at the point, i.e. the neighborhood is centered at its own mean. When sampling is non-uniform, or when the neighborhood mean deviates substantially from the point itself, this approximation increases the error of the original algorithm and can even cause it to fail. This paper presents a more rigorous way to compute the tangent space at a data point: the neighborhood matrix is centered at the data point itself. A mathematical derivation proves that, under a first-order Taylor approximation, the space obtained by this computation is exactly the tangent space at the data point. On this basis, an improved local tangent space alignment algorithm is proposed, and experiments confirm its effectiveness and stability. Compared with the classical algorithm, the proposed computation adds no computational complexity.
Author: 顾艳春
Source: Application Research of Computers (《计算机应用研究》, CSCD, Peking University core journal), 2013, No. 3, pp. 728-731 (4 pages)
Keywords: manifold learning; data reduction; local tangent space alignment (LTSA); tangent space; covariance matrix
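The centering change described in the abstract can be sketched in a few lines. The following is a minimal NumPy illustration, not the paper's implementation; the function and parameter names are my own. Classical LTSA centers each neighborhood at its mean before extracting principal directions, while the proposed variant centers it at the data point itself; both cost one SVD of the same size, so the complexity is unchanged:

```python
import numpy as np

def local_tangent_basis(X_nbrs, x_i, d, center_at_point=True):
    """Estimate a d-dimensional tangent basis at x_i from its neighborhood.

    X_nbrs: (k, D) array of the k neighbors of x_i (rows are points).
    center_at_point=True  -> proposed centering at x_i itself;
    center_at_point=False -> classical centering at the neighborhood mean.
    Returns a (D, d) orthonormal basis of the estimated tangent space.
    """
    center = x_i if center_at_point else X_nbrs.mean(axis=0)
    M = X_nbrs - center                     # centered neighborhood matrix
    # Principal directions = top-d right singular vectors of M.
    _, _, Vt = np.linalg.svd(M, full_matrices=False)
    return Vt[:d].T
```

When the neighborhood mean nearly coincides with `x_i` (dense, uniform sampling) the two choices give almost the same subspace; the difference matters precisely in the non-uniform-sampling regime the abstract highlights.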
