
Semi-Supervised Clustering Based on Affinity Propagation Algorithm
(基于近邻传播算法的半监督聚类)
Cited by: 165
Abstract: A semi-supervised clustering method based on the affinity propagation (AP) algorithm is proposed. AP clusters data directly from a matrix of pairwise similarities between data points. For large datasets it is a fast and efficient clustering algorithm, which traditional methods such as K-centers clustering cannot match; however, for datasets with complex cluster structures AP often fails to produce good results. The proposed method uses known labeled data or pairwise constraints to adjust the similarity matrix, thereby improving AP's clustering performance. Experimental results show that the method not only improves AP's results on complex data, but also outperforms the comparative algorithms when the number of pairwise constraints is large.
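The abstract describes adjusting the pairwise similarity matrix with prior knowledge (labels or must-link/cannot-link constraints) before running AP. Below is a minimal illustrative sketch of that general idea, not the paper's exact adjustment rule: it assumes scikit-learn's AffinityPropagation with a precomputed similarity matrix, and the specific adjustment (raising must-link similarities to the maximum off-diagonal value and lowering cannot-link similarities to the minimum) is an assumption made for illustration.

```python
import numpy as np
from sklearn.cluster import AffinityPropagation

def adjusted_similarity(X, must_link=(), cannot_link=()):
    """Negative squared Euclidean similarity, adjusted by pairwise constraints.

    must_link / cannot_link are iterables of index pairs (i, j). The rule used
    here (push constrained pairs to the max / min off-diagonal similarity) is
    an illustrative choice, not the paper's exact formula.
    """
    # Standard AP similarity: s(i, j) = -||x_i - x_j||^2
    sq_norms = np.sum(X ** 2, axis=1)
    S = -(sq_norms[:, None] + sq_norms[None, :] - 2.0 * X @ X.T)

    off_diag = ~np.eye(len(X), dtype=bool)
    s_max, s_min = S[off_diag].max(), S[off_diag].min()
    for i, j in must_link:      # constrained pairs become maximally similar
        S[i, j] = S[j, i] = s_max
    for i, j in cannot_link:    # constrained pairs become maximally dissimilar
        S[i, j] = S[j, i] = s_min
    return S

# Usage: run AP on the adjusted similarities via a precomputed affinity matrix.
X = np.random.RandomState(0).randn(60, 2)
S = adjusted_similarity(X, must_link=[(0, 1)], cannot_link=[(0, 59)])
labels = AffinityPropagation(affinity="precomputed", random_state=0).fit(S).labels_
print(labels)
```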
Authors: 肖宇, 于剑
Source: Journal of Software (《软件学报》; indexed in EI, CSCD, Peking University Core), 2008, No. 11, pp. 2803-2813 (11 pages)
Funding: Supported by the National Natural Science Foundation of China under Grant No. 60875031; the National Basic Research Program of China (973 Program) under Grant No. 2007CB311002; the Program for New Century Excellent Talents in University of China under Grant No. NCET-06-0078; the Research Fund for the Doctoral Program of Higher Education of the Ministry of Education of China under Grant No. 20050004008; and the Fok Ying-Tong Education Foundation for Young Teachers in the Higher Education Institutions of China under Grant No. 101068
Keywords: semi-supervised clustering; affinity propagation; similarity matrix; pairwise constraints; prior knowledge



相关作者

内容加载中请稍等...

相关机构

内容加载中请稍等...

相关主题

内容加载中请稍等...

浏览历史

内容加载中请稍等...
;
使用帮助 返回顶部