

A Latent Variable Model Based on Local Preservation
Abstract: Latent variable models (LVMs) are an effective class of nonlinear dimensionality reduction methods built on smooth kernel mappings from the latent space to the data space. However, such mappings do not guarantee that points which are close in the data space remain close in the latent space, so the local structure of the data is not preserved. To overcome this drawback, a locality preserving latent variable model is proposed. The objective function of locality preserving projections (LPP) is used as prior information on the latent variables of the Gaussian process latent variable model (GP-LVM), constraining the low-dimensional coordinates, and the locality preserving GP-LVM is built by adding this constraint term. Experimental results on common data sets show that, compared with LPP and the original GP-LVM, the proposed method preserves the local structure of the data better.
Source: Pattern Recognition and Artificial Intelligence (《模式识别与人工智能》; indexed in EI, CSCD, Peking University Core), 2010, No. 3, pp. 369-375 (7 pages).
Funding: Supported by the National Natural Science Foundation of China (No. 60702061, 6077106, 60702061) and the Program for Changjiang Scholars and Innovative Research Team in University of the Ministry of Education (No. IRT0645).
Keywords: Dimensionality Reduction, Latent Variable Model (LVM), Local Distance Preservation
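
The abstract describes the construction only at a high level: the LPP objective acts as a prior (a constraint term) on the GP-LVM latent coordinates. As a rough sketch of how such a combined objective can be written (the exact prior and weighting used in the paper may differ; the trade-off weight λ and the neighbourhood affinity matrix W are assumptions here):

\[
\min_{X,\theta}\;
\frac{D}{2}\ln|K_{X,\theta}|
+ \frac{1}{2}\mathrm{tr}\left(K_{X,\theta}^{-1} Y Y^{\top}\right)
+ \lambda\,\mathrm{tr}\left(X^{\top} L X\right),
\qquad L = D_W - W
\]

Here the first two terms are the GP-LVM negative log-likelihood of the observed data Y (N x D) given the latent coordinates X (N x q) and the kernel matrix K_{X,θ} evaluated on X; W is the affinity matrix built from data-space neighbourhoods, D_W its diagonal degree matrix, and λ a trade-off weight. Since tr(X^T L X) = (1/2) Σ_ij W_ij ||x_i − x_j||^2, minimizing the last term pulls latent points together whenever the corresponding data points are neighbours, which is the local-structure preservation the abstract refers to.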

