Method of SVM model selection based on PAC-Bayes bound theory (cited by: 2)
Abstract: The PAC-Bayes bound, which integrates the Bayesian paradigm with the structural risk minimization principle for stochastic classifiers, provides a theoretical framework for effectively evaluating the generalization performance of machine learning algorithms. Addressing the model selection problem of the Support Vector Machine (SVM), this paper analyzes the PAC-Bayes bound framework and its application to SVM, and combines the bound with a cross-validation-based grid search. The resulting model selection method, PBB-GS, rapidly selects the penalty parameter and kernel parameter. Experimental results on UCI datasets show that the parameters selected by PBB-GS give the SVM good generalization performance; the method is simple, fast, and accurate, and effectively improves SVM model selection.
Source: Computer Engineering and Applications, 2015, No. 6, pp. 27-32 (6 pages); indexed in CSCD and the Peking University Core Journals list.
Funding: National Natural Science Foundation of China (No. 61170177); National Basic Research Program of China (973 Program) (No. 2013CB32930X); Tianjin University Innovation Fund; Tianjin University of Finance and Economics research project (No. Q1114).
Keywords: Probably Approximately Correct (PAC)-Bayes bound; Support Vector Machine (SVM); model selection; generalization capability
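The PAC-Bayes bound mentioned in the abstract upper-bounds the true risk of a stochastic (Gibbs) classifier from its empirical risk. As a minimal sketch, assuming the classic Langford-style form of the bound rather than the exact variant used in the paper: given empirical risk, the divergence KL(Q||P) between posterior and prior, sample size m, and confidence parameter delta, the bound is obtained by inverting the binomial KL divergence.

```python
import math

def kl_bernoulli(q, p):
    """KL divergence between Bernoulli(q) and Bernoulli(p)."""
    eps = 1e-12
    q = min(max(q, eps), 1 - eps)
    p = min(max(p, eps), 1 - eps)
    return q * math.log(q / p) + (1 - q) * math.log((1 - q) / (1 - p))

def pac_bayes_bound(emp_risk, kl_qp, m, delta=0.05):
    """Upper bound on the true risk of a Gibbs classifier.

    Langford-style PAC-Bayes theorem (holds with probability >= 1 - delta):
        kl(emp_risk || true_risk) <= (KL(Q||P) + ln((m+1)/delta)) / m
    We return the largest p with kl(emp_risk || p) <= rhs, found by binary
    search, since kl(emp_risk || p) is increasing in p for p >= emp_risk.
    """
    rhs = (kl_qp + math.log((m + 1) / delta)) / m
    lo, hi = emp_risk, 1.0 - 1e-12
    for _ in range(100):
        mid = (lo + hi) / 2
        if kl_bernoulli(emp_risk, mid) > rhs:
            hi = mid
        else:
            lo = mid
    return lo
```

In a grid search of the kind the paper combines with this bound, each candidate pair of penalty and kernel parameters would be scored by the bound value (computed from the trained SVM's empirical risk and KL term), and the pair minimizing the bound would be selected; the details of how PBB-GS computes the KL term for an SVM are in the paper itself.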
