
A Hypothesis-Margin-Based Weak Random Feature Subspace Generation Algorithm

Weak Random Subspace Based on Simba
Abstract  Ensemble learning is a hot research topic in machine learning, and the random subspace method is one of its principal algorithms. The feature subsets generated by random subspace may contain redundant or even noisy features, which degrades classification accuracy. This paper therefore proposes a hypothesis-margin-based weak random feature subspace generation algorithm, Weak Random Subspace Based on Simba (WRSSimba), which effectively removes redundant and noisy features from the generated feature subsets. Experimental results on UCI datasets show that the classification performance of WRSSimba is better than that of the random subspace method and the Simba algorithm.
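The record does not reproduce the algorithm itself, so the following is only a minimal Python sketch of the idea the abstract describes: draw random feature subspaces, score the drawn features with a hypothesis-margin weighting in the spirit of Simba (here a simplified Relief-style update, using the standard hypothesis margin 0.5 * (||x - nearmiss(x)|| - ||x - nearhit(x)||) of a sampled point x), drop the low-weight features, train one base learner per pruned subspace, and combine the learners by majority vote. The function names, the keep_ratio pruning threshold, the number of weighting iterations, and the decision-tree base learner are all illustrative assumptions rather than the authors' specification.

# Hedged sketch of the WRSSimba idea from the abstract, not the published algorithm.
from collections import Counter

import numpy as np
from sklearn.tree import DecisionTreeClassifier


def hypothesis_margin_weights(X, y, n_iter=100, rng=None):
    """Per-feature hypothesis-margin weights for the columns of X.

    For a sampled point x the hypothesis margin is
    0.5 * (||x - nearmiss(x)|| - ||x - nearhit(x)||); each feature is
    credited with its per-dimension contribution (Relief-style update,
    used here as a stand-in for the exact Simba weight update).
    """
    rng = np.random.default_rng(rng)
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(n_iter):
        i = rng.integers(n)
        x, label = X[i], y[i]
        same = (y == label)
        same[i] = False                      # exclude the sampled point itself
        diff = (y != label)
        if not same.any() or not diff.any():
            continue
        # nearest hit (same class) and nearest miss (other class)
        hit = X[same][np.argmin(np.linalg.norm(X[same] - x, axis=1))]
        miss = X[diff][np.argmin(np.linalg.norm(X[diff] - x, axis=1))]
        w += np.abs(x - miss) - np.abs(x - hit)
    return w


def fit_wrssimba(X, y, n_estimators=20, subspace_size=None,
                 keep_ratio=0.6, rng=None):
    """Train an ensemble over margin-pruned random feature subspaces."""
    rng = np.random.default_rng(rng)
    d = X.shape[1]
    subspace_size = subspace_size or max(2, d // 2)
    ensemble = []
    for _ in range(n_estimators):
        subset = rng.choice(d, size=subspace_size, replace=False)
        w = hypothesis_margin_weights(X[:, subset], y, rng=rng)
        n_keep = max(1, int(keep_ratio * subspace_size))
        keep = subset[np.argsort(w)[::-1][:n_keep]]   # keep only high-margin features
        ensemble.append((keep, DecisionTreeClassifier().fit(X[:, keep], y)))
    return ensemble


def predict_wrssimba(ensemble, X):
    """Majority vote over the base learners' predictions."""
    votes = np.stack([clf.predict(X[:, keep]) for keep, clf in ensemble])
    return np.array([Counter(votes[:, j]).most_common(1)[0][0]
                     for j in range(X.shape[0])])

With X and y given as NumPy arrays, fit_wrssimba(X, y) returns the ensemble and predict_wrssimba(ensemble, X_test) returns the voted predictions; the pruning step is what distinguishes this sketch from plain random subspace sampling.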
Authors  李志亮, 黄丹
Source  Journal of Guiyang University (Natural Sciences), 2012, No. 3, pp. 1-10 (10 pages)
Keywords  ensemble learning; random subspace; hypothesis margin; Simba

References (18)

  • 1. Y. L. Zhu, J. Liu, S. Chen. Semi-random subspace method for face recognition. Image and Vision Computing, 27(29), 1358-1370, 2009.
  • 2. T. G. Dietterich. Ensemble Methods in Machine Learning. First International Workshop on Multiple Classifier Systems, 1-15, 2000.
  • 3. M. Skurichina, R. P. W. Duin. Bagging, boosting and the random subspace method for linear classifiers. Pattern Analysis & Applications, 5(2), 121-135, 2002.
  • 4. M. Yang, F. Wang, P. Yang. A novel feature selection algorithm based on hypothesis-margin. Journal of Computers, 3(12), 27-34, 2008.
  • 5. C. Blake, E. Keogh, C. J. Merz. UCI repository of machine learning databases. Irvine: University of California, http://www.ics.uci.edu/~mlearn/MLRepository.html, 1998.
  • 6. X. Wang, X. Tang. Random sampling for subspace face recognition. International Journal of Computer Vision (IJCV), 70(1), 91-104, 2006.
  • 7. R. Gilad-Bachrach, A. Navot, N. Tishby. Margin Based Feature Selection - Theory and Algorithms. ICML, 2004.
  • 8. Y. Freund, R. E. Schapire. A decision-theoretic generalization of on-line learning and an application to boosting. Journal of Computer and System Sciences, 119-139, 1997.
  • 9. R. E. Schapire. The Strength of Weak Learnability. Machine Learning, 5(2), 197-227, 1990.
  • 10. Y. Freund, R. E. Schapire. A Short Introduction to Boosting. Journal of Japanese Society for Artificial Intelligence, 14(5), 771-780, 1999.
