
Adaptive nonlinear dimensionality reduction based on locally principal direction reconstruction

Locally adaptive nonlinear dimensionality reduction
Abstract: Popular nonlinear dimensionality reduction algorithms, such as SIE and Isomap, set their neighborhood parameter globally. Simulations show that for data sets whose local manifold structure varies widely, a globally uniform neighborhood parameter may fail to yield a reasonable embedding. To address this, an adaptive neighbor-selection algorithm based on locally principal direction reconstruction is proposed. The method involves two main steps. First, it selects a neighborhood set for each reference point such that the neighbors approximately span a local d-dimensional principal linear subspace, and computes a basis-vector set for each neighborhood. Second, it fits each neighbor with the principal directions of its neighborhood set, uses the linear fitting error to measure how far the neighbor deviates from the principal subspace, and deletes neighbors whose error exceeds a predefined threshold. Simulations show that the method effectively handles data sets with large variation in local manifold structure; moreover, compared with existing adaptive neighbor-selection strategies, it better screens out isolated noise points near the reference point and the spurious connectivity introduced by large local curvature.
Source: Journal of Computer Applications (《计算机应用》), CSCD, Peking University core journal, 2006, No. 4, pp. 895-897 (3 pages).
Keywords: nonlinear dimensionality reduction; adaptive neighbor selection; locally principal direction; manifold learning
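The two-step scheme summarized in the abstract (local PCA to estimate principal directions, then pruning neighbors by their subspace fitting error) can be sketched as follows. This is a minimal illustration under stated assumptions, not the paper's exact formulation: the function name, the fixed initial neighborhood size `k`, and the error threshold `tol` are all hypothetical choices introduced here.

```python
import numpy as np

def adaptive_neighbors(X, ref_idx, k, d, tol):
    """Prune a k-NN neighborhood via locally principal direction reconstruction.

    X       : (n, D) data matrix
    ref_idx : index of the reference point
    k       : initial neighborhood size (assumed fixed here)
    d       : assumed intrinsic dimensionality of the local subspace
    tol     : fitting-error threshold (hypothetical; the paper's criterion
              may differ)
    Returns the indices of neighbors whose deviation from the local
    d-dimensional principal subspace is within tol.
    """
    ref = X[ref_idx]
    dists = np.linalg.norm(X - ref, axis=1)
    nbr_idx = np.argsort(dists)[1:k + 1]      # k nearest neighbors, excluding ref
    nbrs = X[nbr_idx]
    centered = nbrs - nbrs.mean(axis=0)

    # Locally principal directions: top-d right singular vectors (local PCA).
    _, _, Vt = np.linalg.svd(centered, full_matrices=False)
    basis = Vt[:d]                            # (d, D) basis of the local subspace

    # Fitting error: residual of each neighbor after projection onto the subspace.
    proj = centered @ basis.T @ basis
    err = np.linalg.norm(centered - proj, axis=1)

    return nbr_idx[err <= tol]                # delete neighbors deviating too much
```

For example, for a reference point on a 2D plane in 3D with one off-plane noise point among its nearest neighbors, the noise point's residual against the local 2D principal subspace exceeds the threshold and it is pruned, while the in-plane neighbors are kept.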



