
A Self-Organization-Based Robust Nonlinear Dimensionality Reduction Algorithm (cited by: 4)

Self-Organizing Isometric Embedding
Abstract: Most existing nonlinear dimensionality reduction algorithms suffer from two common flaws. First, they must solve a large-scale eigenvalue problem (or some variation of one), which typically has at least quadratic time complexity. This limits their applicability on large sample sets, which are prevalent in real-world applications such as text clustering, image recognition, and bioinformatics. Second, current algorithms are global and analytic, which makes them sensitive to noise and exposes them to the numerical-precision problems of ill-conditioned matrices. To overcome these problems, a novel self-organizing nonlinear dimensionality reduction algorithm, SIE (self-organizing isometric embedding), is proposed. The time complexity of SIE is O(N log N), an N/log N speedup over most current algorithms. Moreover, the main computing procedure of SIE is a locally self-organizing scheme, which markedly improves the robustness of the algorithm and circumvents the trouble introduced by ill-conditioned matrices. Simulations demonstrate that the reconstruction quality of SIE matches that of the best global algorithms on clean sample sets and is clearly superior to global algorithms on noisy sample sets.
Source: Journal of Computer Research and Development (《计算机研究与发展》; indexed in EI, CSCD, PKU Core), 2005, Issue 2, pp. 188-195 (8 pages)
Funding: Tianjin Science and Technology Development Program project (023100511)
Keywords: nonlinear dimensionality reduction; self-organizing; robustness; machine learning

