
Improved SMO Algorithm Based on Different Penalty Coefficients

Improved SMO algorithm with different error costs
Abstract: When Keerthi's improved Sequential Minimal Optimization (SMO) algorithm is applied to unbalanced datasets, its overall classification performance is poor and its results are unstable. To overcome this, the algorithm is further improved by imposing different penalty coefficients (error costs) on the two classes; the corresponding computation formulas and algorithm steps are given. Experimental results show that the improved algorithm both handles unbalanced datasets better and is more stable.
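The core idea described in the abstract is to replace the single penalty coefficient C of the soft-margin SVM with two coefficients, one per class, so that the box constraint in SMO becomes 0 ≤ αᵢ ≤ C₊ for positive samples and 0 ≤ αᵢ ≤ C₋ for negative ones, letting errors on the minority class be penalised more heavily. The sketch below illustrates that idea inside a simplified SMO loop (random choice of the second index rather than Platt's or Keerthi's working-set heuristics); it is a toy illustration under our own assumptions, not the paper's exact algorithm, and all names (`smo_train`, `c_pos`, `c_neg`, the toy dataset) are ours.

```python
import random

def smo_train(X, y, c_pos, c_neg, tol=1e-3, max_passes=10, seed=0):
    """Simplified SMO with a linear kernel and class-dependent penalties:
    the box constraint for alpha_i is [0, c_pos] when y_i = +1 and
    [0, c_neg] when y_i = -1."""
    rng = random.Random(seed)
    n = len(X)
    K = [[sum(a * b for a, b in zip(X[i], X[j])) for j in range(n)]
         for i in range(n)]
    C = [c_pos if yi > 0 else c_neg for yi in y]   # per-sample upper bound
    alpha = [0.0] * n
    b = 0.0

    def err(i):  # E_i = f(x_i) - y_i
        return sum(alpha[k] * y[k] * K[k][i] for k in range(n)) + b - y[i]

    passes = 0
    while passes < max_passes:
        changed = 0
        for i in range(n):
            Ei = err(i)
            # KKT violation check against this sample's own bound C[i]
            if (y[i] * Ei < -tol and alpha[i] < C[i]) or \
               (y[i] * Ei > tol and alpha[i] > 0):
                j = rng.choice([k for k in range(n) if k != i])
                Ej = err(j)
                ai, aj = alpha[i], alpha[j]
                if y[i] != y[j]:   # alpha_j - alpha_i is constant
                    L, H = max(0.0, aj - ai), min(C[j], C[i] + aj - ai)
                else:              # alpha_i + alpha_j is constant
                    L, H = max(0.0, ai + aj - C[i]), min(C[j], ai + aj)
                if L >= H:
                    continue
                eta = 2 * K[i][j] - K[i][i] - K[j][j]
                if eta >= 0:
                    continue
                alpha[j] = min(H, max(L, aj - y[j] * (Ei - Ej) / eta))
                if abs(alpha[j] - aj) < 1e-5:
                    continue
                alpha[i] += y[i] * y[j] * (aj - alpha[j])
                b1 = b - Ei - y[i] * (alpha[i] - ai) * K[i][i] \
                            - y[j] * (alpha[j] - aj) * K[i][j]
                b2 = b - Ej - y[i] * (alpha[i] - ai) * K[i][j] \
                            - y[j] * (alpha[j] - aj) * K[j][j]
                if 0 < alpha[i] < C[i]:
                    b = b1
                elif 0 < alpha[j] < C[j]:
                    b = b2
                else:
                    b = (b1 + b2) / 2
                changed += 1
        passes = passes + 1 if changed == 0 else 0
    return alpha, b

# Imbalanced toy set: 2 positive vs. 6 negative points in 2-D.
X = [[2.0, 2.0], [2.5, 2.0],
     [-1.0, -1.0], [-1.5, -0.5], [-2.0, -1.0],
     [-1.0, -2.0], [-0.5, -1.5], [-2.0, -2.0]]
y = [1, 1, -1, -1, -1, -1, -1, -1]

# Larger penalty on the minority (positive) class.
alpha, b = smo_train(X, y, c_pos=10.0, c_neg=1.0)

def decision(x):
    """f(x) = sum_i alpha_i y_i <x_i, x> + b"""
    return sum(a * yi * sum(p * q for p, q in zip(xi, x))
               for a, yi, xi in zip(alpha, y, X)) + b
```

Only the bound `C[i]` changes relative to standard SMO: the KKT check and the clipping interval [L, H] consult each sample's own class penalty, which is what shifts the decision boundary away from the minority class.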
Authors: TIAN Dadong (田大东), DENG Wei (邓伟)
Source: Journal of Computer Applications (《计算机应用》, CSCD, Peking University Core Journal), 2008, No. 9: 2369-2370, 2374 (3 pages)
Funding: National Natural Science Foundation of China (60572074)
Keywords: unbalanced datasets; penalty coefficients (error costs); Sequential Minimal Optimization (SMO)

References (6)

  • 1. PLATT J C. Sequential minimal optimization: A fast algorithm for training support vector machines[R]. Technical Report MSR-TR-98-14, Microsoft Research, 1998.
  • 2. KEERTHI S S, SHEVADE S K, BHATTACHARYYA C, et al. Improvements to Platt's SMO algorithm for SVM classifier design[J]. Neural Computation, 2001, 13(3): 637-649.
  • 3. VEROPOULOS K, CAMPBELL C, CRISTIANINI N. Controlling the sensitivity of support vector machines[C]// Proceedings of the International Joint Conference on AI. 1999: 55-60.
  • 4. VAPNIK V N. Statistical learning theory[M]. XU Jianhua, ZHANG Xuegong, trans. Beijing: Publishing House of Electronics Industry, 2004.
  • 5. LEE Y, LIN Y, WAHBA G. Multicategory support vector machines[R]. Technical Report 1040, Department of Statistics, University of Wisconsin-Madison, 2001.
  • 6. SANCHEZ A V D. Advanced support vector machines and kernel methods[J]. Neurocomputing, 2003, 55(1/2): 5-20.
