
An Improved Local Tangent Space Alignment Algorithm (cited by: 36)

A Better Scaled Local Tangent Space Alignment Algorithm
Abstract: Recently, a new manifold learning algorithm, LTSA (local tangent space alignment), has been proposed. It is efficient for many nonlinear dimension reduction problems but unsuited to large data sets and newly arriving data. In this paper, an improved algorithm called partitional local tangent space alignment (PLTSA) is presented, which is based on VQPCA (vector quantization principal component analysis) and LTSA. In the algorithm, the sample space is first divided into overlapping blocks using the X-means algorithm. Each block is then projected onto its local tangent space to obtain the local low-dimensional coordinates of the points in it. Finally, the global low-dimensional embedding is obtained by applying translation, rotation, and scaling transformations to the local coordinates. PLTSA improves on VQPCA in that it yields global coordinates of the data, and it works on a much smaller optimization matrix than LTSA, which leads to a better-scaled algorithm. The algorithm also provides a set of transformations that allow the global embedded coordinates of newly arriving data points to be calculated, which many manifold learning algorithms cannot do. Experiments illustrate the validity of the algorithm.
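The pipeline summarized in the abstract (partition the samples into overlapping blocks, project each block onto its local tangent space via local PCA, then align neighboring charts with an affine map on their overlap) can be sketched in a few lines of NumPy. This is an illustrative sketch, not the authors' implementation: it replaces X-means with a fixed two-block split and uses a 1-D manifold so the alignment step reduces to a scalar affine fit.

```python
import numpy as np

def local_coords(block, d):
    """Local PCA: project one block of points onto its d-dim tangent space."""
    c = block - block.mean(axis=0)
    # SVD of the centered block gives the principal (tangent) directions.
    _, _, Vt = np.linalg.svd(c, full_matrices=False)
    return c @ Vt[:d].T  # local low-dimensional coordinates

# Points on a 1-D manifold (a parabolic arc) embedded in 2-D.
t = np.linspace(0.0, 1.0, 40)
X = np.column_stack([t, t ** 2])

# Two overlapping blocks, standing in for the X-means partition with overlap.
b1, b2 = X[:25], X[15:]
y1, y2 = local_coords(b1, 1), local_coords(b2, 1)

# Align chart 2 to chart 1 on the overlap with a 1-D affine map y -> a*y + b
# (the 1-D analogue of the translation/rotation/scaling alignment step).
o1, o2 = y1[15:25, 0], y2[:10, 0]           # overlap: original points 15..24
A = np.column_stack([o2, np.ones_like(o2)])
(a, b), *_ = np.linalg.lstsq(A, o1, rcond=None)
y2_global = a * y2[:, 0] + b                # block 2 in block 1's frame

# After alignment the two local charts agree on the overlap.
err = np.abs(y2_global[:10] - o1).max()
```

In the full algorithm each block's coordinates come from its own tangent-space chart, and solving for all the per-block affine maps jointly (rather than pairwise as above) yields the global embedding; the same stored maps let a new point be embedded by projecting it in its block and applying that block's transformation.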
Source: Journal of Software (《软件学报》), EI / CSCD / Peking University Core indexed, 2005, No. 9, pp. 1584-1590 (7 pages).
Funding: National Natural Science Foundation of China; National Key Basic Research and Development Program of China (973 Program).
Keywords: dimensionality reduction, manifold learning, principal component analysis, local principal component analysis, local tangent space alignment, X-means

References (14)

  • 1. Seung HS, Lee DD. The manifold ways of perception. Science, 2000,290(5500):2268-2269.
  • 2. Donoho DL, Grimes C. Hessian Eigenmaps: New locally linear embedding techniques for high-dimensional data. Proc. of the National Academy of Sciences of the United States of America, 2003,100(10):5591-5596.
  • 3. http://www.cse.msu.edu/~lawhiu/manifold/ [EB/OL].
  • 4. Tenenbaum J, Silva VD, Langford J. A global geometric framework for nonlinear dimensionality reduction. Science, 2000,290(5500):2319-2323.
  • 5. Roweis S, Saul L. Nonlinear dimensionality reduction by locally linear embedding. Science, 2000,290(5500):2323-2326.
  • 6. Belkin M, Niyogi P. Laplacian Eigenmaps for dimensionality reduction and data representation. Neural Computation, 2003,15(6):1373-1396.
  • 7. Min WL, Lu L, He XF. Locality pursuit embedding. Pattern Recognition, 2004,37(4):781-788.
  • 8. Zhang ZY, Zha HY. Principal manifolds and nonlinear dimensionality reduction via tangent space alignment. SIAM Journal on Scientific Computing, 2004,26(1):313-338.
  • 9. Kambhatla N, Leen TK. Dimension reduction by local principal component analysis. Neural Computation, 1997,9(7):1493-1516.
  • 10. Pelleg D, Moore A. X-means: Extending K-means with efficient estimation of the number of clusters. In: Langley P, ed. Proc. of the 17th Int'l Conf. on Machine Learning. San Francisco: Morgan Kaufmann Publishers, 2000. 727-734.

Co-cited references: 550

Citing articles: 36

Second-level citing articles: 194
