
Experimental Analysis of the Objective Function Value of the GGMC Algorithm and an Algorithm Improvement
Abstract: To address the high computational complexity of the greedy max-cut graph-based semi-supervised learning algorithm (GGMC), an improved algorithm based on an early-stopping strategy, called GGMC-Estop, is proposed. Based on an experimental analysis of how the objective function value evolves during GGMC's optimization, two strategies are adopted to stop the GGMC iterations in their early stage; the labels of all samples on the graph are then predicted by a single standard label-propagation step. Simulation results on typical data sets show that the improved algorithm achieves classification performance close to that of GGMC while requiring far less computation.
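A minimal sketch of the final prediction step described above, under assumed names and interfaces rather than the paper's own code: given an affinity matrix W, one-hot labels Y_l for the labeled vertices, and index arrays for the labeled and unlabeled vertex sets, one standard label-propagation pass assigns each unlabeled vertex the class receiving the largest degree-normalized weighted vote from its neighbours.

import numpy as np

def one_step_label_propagation(W, Y_l, labeled_idx, unlabeled_idx):
    """Predict the unlabeled vertices with one standard label-propagation step.

    W             -- (n, n) symmetric affinity matrix of the graph (illustrative input)
    Y_l           -- (l, c) one-hot label matrix of the labeled vertices
    labeled_idx   -- indices of the labeled vertices in W
    unlabeled_idx -- indices of the unlabeled vertices in W
    """
    # Row-normalize the affinities into transition probabilities P = D^{-1} W.
    degrees = W.sum(axis=1, keepdims=True)
    P = W / np.maximum(degrees, 1e-12)

    # Propagate label mass once from the labeled block to the unlabeled block.
    F_u = P[np.ix_(unlabeled_idx, labeled_idx)] @ Y_l

    # Each unlabeled vertex takes the class with the largest propagated mass.
    return F_u.argmax(axis=1)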
Source: Computer Engineering and Applications (计算机工程与应用), CSCD, Peking University Core Journal list, 2015, Issue 12, pp. 111-117, 188 (8 pages total).
Funding: National Natural Science Foundation of China (No. 71371012, No. 71171002); Humanities and Social Sciences Planning Project of the Ministry of Education (No. 13YJA630098); Key Project of the Anhui Province Outstanding Young Talents Fund (No. 2013SQRL034Z); Anhui Provincial Scientific Research Project for Universities (No. TSKJ2014B10); Youth Fund of Anhui Polytechnic University (No. 2013YQ30).
Keywords: graph-based semi-supervised learning; greedy max-cut; early stopping strategy; objective function value


