Journal Article

Local and Global Preserving Based Semi-Supervised Dimensionality Reduction Method (cited by: 25)
Abstract: In many machine learning and data mining tasks, using only side-information does not achieve the best semi-supervised learning results. Therefore, a local and global preserving based semi-supervised dimensionality reduction (LGSSDR) method is proposed. LGSSDR preserves not only the positive (must-link) and negative (cannot-link) constraints but also the local and global structure of the data manifold in the low-dimensional embedding subspace. In addition, the algorithm computes an explicit transformation matrix and can therefore handle unseen samples easily. Experimental results on several datasets demonstrate the effectiveness of the method.
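The abstract does not spell out the optimization, but methods of this family are typically posed as a graph-embedding problem: a neighborhood graph encodes local/global manifold structure, pairwise constraints adjust the edge weights, and the linear transformation comes from a generalized eigenproblem. The sketch below is an illustrative reconstruction under those assumptions, not the authors' exact algorithm; the function name `lgssdr_sketch` and all weighting choices (`alpha`, `beta`, binary k-NN weights) are hypothetical.

```python
import numpy as np

def lgssdr_sketch(X, must_link, cannot_link, n_neighbors=5, dim=2,
                  alpha=1.0, beta=1.0):
    """Illustrative graph-embedding style semi-supervised reduction.

    X: (n, d) data matrix; must_link / cannot_link: lists of index pairs.
    Returns a (d, dim) transformation matrix W; project with X @ W.
    """
    n = X.shape[0]
    # Local structure: symmetric k-NN adjacency with binary weights.
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    S = np.zeros((n, n))
    for i in range(n):
        idx = np.argsort(d2[i])[1:n_neighbors + 1]  # skip self
        S[i, idx] = 1.0
    S = np.maximum(S, S.T)
    # Side-information: strengthen must-link edges, penalize cannot-link.
    for i, j in must_link:
        S[i, j] = S[j, i] = S[i, j] + alpha
    for i, j in cannot_link:
        S[i, j] = S[j, i] = S[i, j] - beta
    D = np.diag(S.sum(axis=1))
    L = D - S  # graph Laplacian of the combined similarity graph
    # Generalized eigenproblem  X^T L X w = lambda X^T D X w;
    # eigenvectors with the smallest eigenvalues best preserve the graph.
    A = X.T @ L @ X
    B = X.T @ D @ X + 1e-6 * np.eye(X.shape[1])  # regularize for stability
    eigvals, eigvecs = np.linalg.eig(np.linalg.solve(B, A))
    order = np.argsort(eigvals.real)
    return eigvecs[:, order[:dim]].real
```

Because the result is an explicit linear map `W`, an unseen sample `x_new` is embedded by `x_new @ W` with no retraining, which is the "handles unseen samples easily" property claimed in the abstract.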
Authors: 韦佳, 彭宏
Source: Journal of Software (《软件学报》; indexed in EI, CSCD, Peking University Core), 2008, No. 11, pp. 2833-2842 (10 pages)
Funding: Supported by the Natural Science Foundation of Guangdong Province of China under Grant No. 07006474, and the Science & Technology Research Project of Guangdong Province of China under Grant No. 2007B010200044
Keywords: side-information; local and global preserving; semi-supervised learning; dimensionality reduction; graph embedding


Co-cited references: 226

Citing articles: 25

Secondary citing articles: 88
