
Nonlinear Dimensionality Reduction by Local Orthogonality Preserving Alignment (Cited by: 2)

Abstract: We present a new manifold learning algorithm called Local Orthogonality Preserving Alignment (LOPA). Our algorithm is inspired by the Local Tangent Space Alignment (LTSA) method, which aims to align multiple local neighborhoods into a global coordinate system using affine transformations. However, LTSA often fails to preserve original geometric quantities such as distances and angles. Although an iterative alignment procedure for preserving orthogonality was suggested by the authors of LTSA, neither the corresponding initialization nor the experiments were given. Procrustes Subspaces Alignment (PSA) implements the orthogonality-preserving idea by estimating each rotation transformation separately with simulated annealing. However, the optimization in PSA is complicated, and multiple separate local rotations may produce globally contradictory results. To address these difficulties, we first use the pseudo-inverse trick of LTSA to represent each local orthogonal transformation in terms of the unified global coordinates. Second, the orthogonality constraints are relaxed into an instance of semi-definite programming (SDP). Finally, a two-step iterative procedure is employed to further reduce the errors in the orthogonality constraints. Extensive experiments show that LOPA can faithfully preserve distances, angles, inner products, and neighborhoods of the original datasets. In comparison, the embedding performance of LOPA is better than that of PSA and comparable to that of state-of-the-art algorithms like MVU and MVE, while the runtime of LOPA is significantly faster than that of PSA, MVU, and MVE.
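The LTSA alignment step that LOPA builds on can be made concrete with a short sketch. The code below is a minimal illustration of the standard LTSA algorithm, not the authors' LOPA implementation; the function name and the parameters k (neighborhood size) and d (target dimension) are our own illustrative choices.

    # Minimal sketch of standard LTSA: align local tangent-space coordinates
    # into one global embedding. Illustrative only, not the LOPA code.
    import numpy as np

    def ltsa(X, d, k):
        """Embed n points X (n x D) into d dimensions using k-NN patches."""
        n = X.shape[0]
        dist2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
        B = np.zeros((n, n))  # global alignment matrix
        for i in range(n):
            idx = np.argsort(dist2[i])[:k]      # neighborhood of point i
            Xi = X[idx] - X[idx].mean(axis=0)   # center the local patch
            # Orthonormal basis of the local tangent coordinates: top-d
            # left singular vectors of the centered patch.
            U = np.linalg.svd(Xi, full_matrices=False)[0]
            G = np.hstack([np.ones((k, 1)) / np.sqrt(k), U[:, :d]])
            # Accumulate the local alignment error I - G G^T.
            B[np.ix_(idx, idx)] += np.eye(k) - G @ G.T
        # Global coordinates: eigenvectors for the 2nd..(d+1)-th smallest
        # eigenvalues of B (the smallest belongs to the constant vector).
        _, V = np.linalg.eigh(B)
        return V[:, 1:d + 1]

Because the optimal alignment here is only affine, the recovered coordinates may stretch or shear local patches, which is exactly the distance- and angle-distortion problem that motivates LOPA's orthogonality constraints.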
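The second step, relaxing geometric constraints into a semi-definite program over a Gram matrix, can be illustrated with the well-known MVU formulation (one of the baselines compared against in the abstract). LOPA's own SDP constraints are not spelled out here, so the sketch below shows the MVU-style relaxation instead, assuming the cvxpy modeling library.

    # MVU-style SDP over the Gram matrix K of the embedding: a hedged
    # illustration of the "geometry as SDP" idea, not LOPA's exact program.
    import numpy as np
    import cvxpy as cp

    def mvu(X, pairs, d):
        """X: n x D data; pairs: neighbor index pairs (i, j); d: target dim."""
        n = X.shape[0]
        K = cp.Variable((n, n), PSD=True)   # Gram matrix, K = Y Y^T
        cons = [cp.sum(K) == 0]             # center the embedding at the origin
        for i, j in pairs:
            # Preserve each local distance: ||y_i - y_j||^2 = ||x_i - x_j||^2.
            cons.append(K[i, i] + K[j, j] - 2 * K[i, j]
                        == float(np.sum((X[i] - X[j]) ** 2)))
        # "Unfold" the manifold by maximizing the total variance, trace(K).
        cp.Problem(cp.Maximize(cp.trace(K)), cons).solve()
        # Factor K back into d-dimensional coordinates via its top eigenpairs.
        w, V = np.linalg.eigh(K.value)
        return V[:, -d:] * np.sqrt(np.maximum(w[-d:], 0))

Solving a dense n x n SDP of this kind is the dominant cost of MVU and MVE, which is consistent with the abstract's claim that LOPA runs significantly faster.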
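The keyword "Procrustes measure" suggests how embedding quality is judged: rotate each embedded neighborhood optimally onto its original patch and report the residual. The following is our assumed form of such a measure based on the classical orthogonal Procrustes solution; the paper's exact definition may differ.

    # Hedged sketch of a local Procrustes-style error: residual after the best
    # orthogonal rotation of an embedded patch onto the original patch.
    import numpy as np

    def procrustes_error(Xi, Yi):
        """Xi: k x D centered original patch; Yi: k x d centered embedded patch."""
        k, D = Xi.shape
        Yi = np.hstack([Yi, np.zeros((k, D - Yi.shape[1]))])  # pad to D dims
        # Classical orthogonal Procrustes: R = U V^T from the SVD of Yi^T Xi.
        U, _, Vt = np.linalg.svd(Yi.T @ Xi)
        R = U @ Vt
        return np.linalg.norm(Xi - Yi @ R) / np.linalg.norm(Xi)

Values near zero mean the neighborhood was mapped near-isometrically, i.e., distances and angles survived the embedding.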
Source: Journal of Computer Science & Technology (SCIE, EI, CSCD), 2016, No. 3: 512-524 (13 pages).
Funding: This work was supported by the National Basic Research 973 Program of China under Grant No. 2011CB302202, the National Natural Science Foundation of China under Grant Nos. 61375051 and 61075119, and the Seeding Grant for Medicine and Information Sciences of Peking University of China under Grant No. 2014-MI-21.
Keywords: manifold learning, dimensionality reduction, semi-definite programming, Procrustes measure

References (31)

  • 1. Tenenbaum J B, de Silva V, Langford J C. A global geometric framework for nonlinear dimensionality reduction. Science, 2000, 290(5500): 2319-2323.
  • 2. Roweis S T, Saul L K. Nonlinear dimensionality reduction by locally linear embedding. Science, 2000, 290(5500): 2323-2326.
  • 3. Saul L K, Roweis S T. Think globally, fit locally: Unsupervised learning of low dimensional manifolds. The Journal of Machine Learning Research, 2003, 4: 119-155.
  • 4. Seung H S, Lee D D. The manifold ways of perception. Science, 2000, 290(5500): 2268-2269.
  • 5. Donoho D L. High-dimensional data analysis: The curses and blessings of dimensionality. In Proc. AMS Mathematical Challenges of the 21st Century, Aug. 2000.
  • 6. Pei Y, Huang F, Shi F, Zha H. Unsupervised image matching based on manifold alignment. IEEE Trans. Pattern Analysis and Machine Intelligence (TPAMI), 2012, 34(8): 1658-1664.
  • 7. Zhang J, Huang H, Wang J. Manifold learning for visualizing and analyzing high-dimensional data. IEEE Intelligent Systems, 2010, 25(4): 54-61.
  • 8. Weinberger K Q, Saul L K. Unsupervised learning of image manifolds by semidefinite programming. International Journal of Computer Vision, 2006, 70(1): 77-90.
  • 9. Weinberger K Q, Sha F, Saul L K. Learning a kernel matrix for nonlinear dimensionality reduction. In Proc. the 21st Int. Conf. Machine Learning, July 2004.
  • 10. Shaw B, Jebara T. Minimum volume embedding. In Proc. the 11th Int. Conf. Artificial Intelligence and Statistics, Mar. 2007, pp.460-467.

