
A Novel Unified Manifold Learning Framework and an Improved Laplacian Eigenmap

Cited by: 15
Abstract: Manifold learning is crucial in many research fields, such as pattern recognition, data mining, and computer vision. However, little work has focused on developing a common framework that can unify the various approaches. Meanwhile, since the Laplacian eigenmap (LE) is a local manifold learning approach, it is very sensitive to the neighborhood size. After examining a wide range of manifold learning methods, this paper proposes a novel unified manifold learning framework. It consists of two functional items, i.e., the maintaining item and the expecting item, and most approaches can be analyzed and improved within it; LE is analyzed within the framework for illustration. An improved Laplacian eigenmap (ILE) is then presented, built mainly on LE and maximum variance unfolding (MVU). The local character of the graph Laplacian, which serves as the maintaining item, is preserved, while the variances between any two points, which correspond to the expecting item, are maximized. ILE inherits the advantages of LE and MVU: compared with LE it is far less sensitive to the neighborhood size, and the overly strict local constraints and heavy computational cost of MVU are avoided. Moreover, ILE maintains the clustering property of the data and discovers its intrinsic characteristics. Experiments on both toy examples and real data sets illustrate the effectiveness of ILE.
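The LE baseline that ILE builds on can be sketched in a few lines: build a k-nearest-neighbor graph, form the graph Laplacian L = D - W (the "maintaining item" in the framework's terms), and solve the generalized eigenproblem L y = λ D y. The sketch below is a minimal illustration of standard LE, not the paper's ILE; the toy data, binary edge weights, and neighborhood size are illustrative assumptions.

```python
import numpy as np
from scipy.linalg import eigh

def laplacian_eigenmap(X, n_neighbors=10, n_components=2):
    """Minimal Laplacian eigenmap with a symmetric kNN graph and 0/1 weights."""
    n = X.shape[0]
    # Pairwise squared Euclidean distances
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    # kNN adjacency (column 0 of argsort is the point itself, so skip it)
    idx = np.argsort(sq, axis=1)[:, 1:n_neighbors + 1]
    W = np.zeros((n, n))
    W[np.repeat(np.arange(n), n_neighbors), idx.ravel()] = 1.0
    W = np.maximum(W, W.T)            # symmetrize the graph
    D = np.diag(W.sum(axis=1))        # degree matrix
    L = D - W                         # graph Laplacian (the "maintaining item")
    # Generalized eigenproblem L y = lambda D y; eigenvalues come back sorted,
    # and the first eigenvector is the trivial constant one, so skip it.
    vals, vecs = eigh(L, D)
    return vecs[:, 1:n_components + 1]

# Toy example: a noisy circle embedded in 3-D
rng = np.random.default_rng(0)
t = np.linspace(0, 2 * np.pi, 200, endpoint=False)
X = np.c_[np.cos(t), np.sin(t), 0.05 * rng.standard_normal(200)]
Y = laplacian_eigenmap(X, n_neighbors=8, n_components=2)
print(Y.shape)  # (200, 2)
```

The sensitivity to `n_neighbors` that the abstract criticizes is easy to observe here: too small a value disconnects the graph, too large a value shortcuts across the manifold. ILE's "expecting item" (maximizing pairwise variances, as in MVU) is the paper's remedy and is not reproduced in this sketch.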
Source: Journal of Computer Research and Development (《计算机研究与发展》; indexed in EI and CSCD, Peking University core journal), 2009, No. 4, pp. 676-682 (7 pages).
Funding: National 973 Key Basic Research Program of China (2005CB321800); National Natural Science Foundation of China (60673090); Excellent Doctoral Innovation Fund of the National University of Defense Technology (B070201); Hunan Provincial Excellent Doctoral Innovation Fund.
Keywords: dimensionality reduction; manifold learning; unified framework; Laplacian eigenmap; maximum variance unfolding

References (13)

  • 1 Seung H S, Lee D D. The manifold ways of perception [J]. Science, 2000, 290(12): 2268-2269
  • 2 Tenenbaum J, de Silva V. A global geometric framework for nonlinear dimensionality reduction [J]. Science, 2000, 290(12): 2319-2323
  • 3 Saul L K, Roweis S T. Think globally, fit locally: Unsupervised learning of low dimensional manifolds [J]. Journal of Machine Learning Research, 2003, 4(6): 119-155
  • 4 Belkin M, Niyogi P. Laplacian eigenmaps for dimensionality reduction and data representation [J]. Neural Computation, 2003, 15(6): 1373-1396
  • 5 Zhang Zhenyue, Zha Hongyuan. Principal manifolds and nonlinear dimensionality reduction via tangent space alignment [J]. SIAM Journal of Scientific Computing, 2004, 26(1): 313-338
  • 6 Weinberger K Q, Sha F, Saul L K. Learning a kernel matrix for nonlinear dimensionality reduction [C] // Proc of the 21st Int Conf on Machine Learning (ICML-04). New York: ACM, 2004: 839-846
  • 7 Luo Siwei, Zhao Lianwei. Manifold learning algorithms based on spectral graph theory [J]. Journal of Computer Research and Development, 2006, 43(7): 1173-1179 (in Chinese)
  • 8 Yan Shuicheng, Xu Dong, et al. Graph embedding and extensions: A general framework for dimensionality reduction [J]. IEEE Trans on Pattern Analysis and Machine Intelligence (TPAMI), 2007, 29(1): 40-51
  • 9 Lu Fan. Regularized nonparametric logistic regression and kernel regularization [D]. Madison, WI: Department of Statistics, University of Wisconsin, 2006
  • 10 Bauer F, Pereverzev S, Rosasco L. On regularization algorithms in learning theory [J]. Journal of Complexity, 2007, 23(1): 52-57


