
Influence of Bias b on Generalization Ability of SVM for Classification (cited by: 1)
Abstract: Poggio pointed out that the bias term b in the support vector machine (SVM) exists to guarantee positive definiteness of the kernel, so b is not needed when the kernel used is positive definite. To examine the influence of b on the generalization performance of SVM for classification, the optimization problem of SVM without b is studied and a corresponding active-set solution algorithm is given. Experiments on the XOR classification problem show that the equality constraint ∑_{i=1}^{N} y_i α_i = 0 can prevent SVM from reaching the optimal separating hyperplane. The benchmark data sets include small-to-medium, large-scale, high-dimensional, and multi-class classification data sets, and both Gaussian and polynomial positive definite kernels are used. Experimental results on 26 standard data sets show that SVM without b has lower computational cost and better generalization performance than SVM. Parameter-sensitivity tests show that SVM without b is less sensitive to changes in the cost parameter, which allows it to reach the best test accuracy over fewer parameter-value pairs.
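The abstract contrasts the standard SVM dual, whose equality constraint ∑_{i=1}^{N} y_i α_i = 0 is induced by the bias term b, with a bias-free variant whose dual has only box constraints. As an illustrative sketch (not the paper's actual active-set solver), the code below solves the bias-free dual by simple projected gradient ascent on the XOR problem with a Gaussian kernel; all parameter values (C, gamma, step size, iteration count) are assumptions chosen for this toy example.

```python
import numpy as np

def rbf_kernel(X, Z, gamma=1.0):
    # Gaussian (RBF) kernel matrix: K[i, j] = exp(-gamma * ||X[i] - Z[j]||^2)
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-gamma * d2)

def train_svm_no_bias(X, y, C=10.0, gamma=1.0, lr=0.1, iters=2000):
    # Bias-free SVM dual: maximize sum(a) - 0.5 * a^T Q a
    # subject only to the box 0 <= a_i <= C. Dropping b removes
    # the equality constraint sum_i y_i a_i = 0, so a plain
    # projected-gradient step suffices here.
    n = len(y)
    K = rbf_kernel(X, X, gamma)
    Q = (y[:, None] * y[None, :]) * K
    a = np.zeros(n)
    for _ in range(iters):
        grad = 1.0 - Q @ a                   # gradient of the dual objective
        a = np.clip(a + lr * grad, 0.0, C)   # project back onto the box
    return a

def predict(X_train, y_train, a, X_test, gamma=1.0):
    # Decision function f(x) = sum_i a_i y_i K(x_i, x) -- no "+ b" term.
    K = rbf_kernel(X_test, X_train, gamma)
    return np.sign(K @ (a * y_train))

# XOR: not linearly separable, but separable with an RBF kernel.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([-1., 1., 1., -1.])
alpha = train_svm_no_bias(X, y)
print(predict(X, y, alpha, X))  # expected to recover the training labels
```

Because the box is the only constraint, each coordinate of α can move independently; with the equality constraint of the standard dual, any update must keep ∑ y_i α_i = 0, which is the restriction the paper identifies as harmful on XOR-like problems.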
Source: Acta Automatica Sinica (EI, CSCD, Peking University Core), 2011, No. 9, pp. 1105-1113 (9 pages)
Funding: Supported by the National High Technology Research and Development Program of China (863 Program) (2008AA01Z136)
Keywords: bias, support vector machine (SVM), generalization ability, active set