
Semi-supervised learning algorithm based on label propagation in constrained range
Abstract  To improve text classification performance, this paper proposes a semi-supervised learning algorithm based on label propagation in a constrained range. First, a probability transition matrix is derived from the similarity matrix, and the constrained range is then determined from this transition matrix. Within that range, a label propagation algorithm under the semi-supervised learning framework computes path-based similarities, and these similarities determine the important paths for label propagation. Because only a few important propagation paths are used, the algorithm does not need to compute the similarity of every path, which greatly reduces the computational complexity. Labels thus propagate between labeled and unlabeled data along a few important paths. Experiments demonstrate the effectiveness of the algorithm.
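The pipeline sketched in the abstract (similarity matrix → probability transition matrix → constrained range → propagation along a few retained paths) can be illustrated with a minimal sketch. This is not the paper's implementation: the Gaussian kernel, the top-k definition of the constrained range, and all names and parameters (`sigma`, `k`, `alpha`, `n_iter`) are assumptions for illustration only.

```python
import numpy as np

def constrained_label_propagation(X, y, n_classes, sigma=1.0, k=5,
                                  alpha=0.99, n_iter=200):
    """X: (n, d) features; y: (n,) integer labels, -1 for unlabeled."""
    n = X.shape[0]
    # 1) Similarity matrix (Gaussian kernel; the paper's choice may differ).
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.exp(-d2 / (2.0 * sigma ** 2))
    np.fill_diagonal(W, 0.0)
    # 2) Probability transition matrix: row-normalize the similarities.
    P = W / W.sum(axis=1, keepdims=True)
    # 3) Constrained range: for each node keep only its k most probable
    #    transitions, discarding every other propagation path.
    drop = np.argsort(P, axis=1)[:, :-k]   # indices of the n-k weakest edges
    P_c = P.copy()
    np.put_along_axis(P_c, drop, 0.0, axis=1)
    P_c = P_c / P_c.sum(axis=1, keepdims=True)
    # 4) Iterative label propagation over the few retained paths only.
    F = np.zeros((n, n_classes))
    labeled = y >= 0
    F[labeled, y[labeled]] = 1.0
    Y0 = F.copy()                          # clamp term for labeled points
    for _ in range(n_iter):
        F = alpha * (P_c @ F) + (1.0 - alpha) * Y0
    return F.argmax(axis=1)
```

For example, six points on a line forming two clusters, with one labeled point per cluster, are classified correctly by propagation along each node's two strongest edges, since the retained paths never cross between clusters.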
Source: Application Research of Computers (《计算机应用研究》, CSCD, Peking University Core Journal), 2016, No. 8: 2303-2306 (4 pages)
Funding: National Natural Science Foundation of China (61363058, 61163039); Gansu Provincial Youth Science and Technology Fund (145RJYA259); Gansu Provincial Natural Science Foundation (145RJZA232); Open Fund of the Key Laboratory of Intelligent Information Processing, Institute of Computing Technology, Chinese Academy of Sciences (IIP2014-4); Northwest Normal University 2013 Young Teachers' Research Capability Promotion Plan (NWNU-LKQN-12-23)
Keywords: probability transition matrix; constrained region; label propagation; semi-supervised learning algorithm

References (12)

  • 1 Zhu Xiaojin. Semi-supervised learning literature survey, TR-1530[R]. [S.l.]: University of Wisconsin-Madison, 2008.
  • 2 Wagstaff K, Cardie C, Rogers S, et al. Constrained K-means clustering with background knowledge[C]//Proc of the 18th International Conference on Machine Learning. San Francisco: Morgan Kaufmann Publishers Inc., 2001: 577-584.
  • 3 Klein D, Kamvar S D, Manning C D. From instance-level constraints to space-level constraints: making the most of prior knowledge in data clustering[C]//Proc of the 19th International Conference on Machine Learning. San Francisco: Morgan Kaufmann Publishers Inc., 2002: 307-314.
  • 4 Shental N, Bar-Hillel A, Hertz T, et al. Computing Gaussian mixture models with EM using equivalence constraints[C]//Advances in Neural Information Processing Systems. 2003: 465-472.
  • 5 He Ping, Xu Xiaohua, Lu Lin, Chen Ling. Two-level random walk based semi-supervised clustering[J]. Journal of Software (软件学报), 2014, 25(5): 997-1013. (cited 12 times)
  • 6 Zhu Xiaojin, Ghahramani Z, Lafferty J. Semi-supervised learning using Gaussian fields and harmonic functions[C]//Proc of the 20th International Conference on Machine Learning. 2003: 912-919.
  • 7 Smola A J, Kondor R. Kernels and regularization on graphs[C]//Proc of Annual Conference on Learning Theory. Berlin: Springer, 2003: 144-158.
  • 8 Fouss F, Pirotte A, Renders J M, et al. Random-walk computation of similarities between nodes of a graph with application to collaborative recommendation[J]. IEEE Trans on Knowledge and Data Engineering, 2007, 19(3): 355-369.
  • 9 Fouss F, Francoisse K, Yen L, et al. An experimental investigation of kernels on graphs for collaborative recommendation and semi-supervised classification[J]. Neural Networks, 2012, 31(7): 53-72.
  • 10 Chapelle O, Weston J, Schölkopf B. Cluster kernels for semi-supervised learning[C]//Advances in Neural Information Processing Systems. 2003.
