Classification and Research of LLE Method (LLE方法的分类与研究)

Abstract: Classification of low-dimensional data is common, but classification of high-dimensional data is far less so, mainly because the dimensionality is too high. In particular, for unevenly distributed sample sets, the traditional locally linear embedding (LLE) algorithm is sensitive to the choice of the number of nearest neighbors. To overcome this problem, an improved-distance LLE algorithm is proposed. Experiments show that the improved-distance algorithm makes the original sample set as evenly distributed as possible, thereby reducing the influence of the nearest-neighbor count on LLE and, while maintaining classification accuracy, effectively shortening the running time.
Author: 屈治礼
Source: Computer Systems & Applications (《计算机系统应用》), 2013, No. 4, pp. 14-17, 50 (5 pages)
Keywords: locally linear embedding; high-dimensional data; classification
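
As context for the abstract above, the snippet below is a minimal sketch of the baseline (standard) LLE algorithm using scikit-learn, showing how the number of nearest neighbors k enters the method; looping over several k values illustrates the sensitivity that the paper's improved-distance variant aims to reduce. The Swiss-roll dataset and the particular k values are illustrative assumptions, and the sketch does not implement the paper's improved distance, which is not detailed in this record.

```python
# Minimal sketch of standard LLE (Roweis & Saul, 2000) via scikit-learn.
# This is NOT the paper's improved-distance variant; it only illustrates
# how the number of nearest neighbors (n_neighbors) enters the method.
from sklearn.datasets import make_swiss_roll
from sklearn.manifold import LocallyLinearEmbedding

# Illustrative sample set: a 3-D Swiss roll with noise (assumed data, not the paper's).
X, color = make_swiss_roll(n_samples=1000, noise=0.05, random_state=0)

# Standard LLE: reconstruct each point from its k nearest neighbors,
# then find a 2-D embedding that preserves those reconstruction weights.
for k in (5, 12, 30):  # results can vary noticeably with k, the issue the paper targets
    lle = LocallyLinearEmbedding(n_neighbors=k, n_components=2, random_state=0)
    Y = lle.fit_transform(X)
    print(f"k={k:2d}  reconstruction error = {lle.reconstruction_error_:.3e}")
```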
