Abstract
Support Vector Machine (SVM) is a kernel-based learning method; the choice of kernel function and kernel parameters directly affects the generalization ability of the SVM model. Traditional parameter selection methods such as grid search are computationally expensive and make the training process very time-consuming. This paper proposes a new method for quickly selecting the optimal kernel parameter: a Separability Measure (SM) between classes in the feature space is computed to determine the optimal parameter, so no corresponding SVM model needs to be trained for each candidate. Computing the SM costs far less time than training the SVM models, which greatly shortens training time and speeds up parameter selection, while the test accuracy of the SVM trained with the chosen parameter remains competitive with that of the traditional method. Experimental results show that the proposed method is feasible and effective.
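The abstract does not reproduce the paper's exact separability-measure formula. As a rough illustration only, the following Python sketch scores candidate RBF kernel parameters with a generic between-class/within-class distance ratio computed directly from Gram matrices, so no SVM has to be trained per candidate; the function names (separability, pick_gamma) and the specific ratio used here are assumptions for illustration, not the authors' definition.

import numpy as np

def rbf_kernel(X, Y, gamma):
    # Gaussian (RBF) kernel matrix between the rows of X and Y.
    sq = (np.sum(X ** 2, axis=1)[:, None]
          + np.sum(Y ** 2, axis=1)[None, :]
          - 2.0 * X @ Y.T)
    return np.exp(-gamma * sq)

def separability(X_a, X_b, gamma):
    # Illustrative class-separability score in the RBF feature space:
    # squared distance between the two class means divided by the sum
    # of the within-class scatters, obtained from kernel values alone.
    Kaa = rbf_kernel(X_a, X_a, gamma)
    Kbb = rbf_kernel(X_b, X_b, gamma)
    Kab = rbf_kernel(X_a, X_b, gamma)
    between = Kaa.mean() + Kbb.mean() - 2.0 * Kab.mean()  # ||mean_phi_A - mean_phi_B||^2
    within = (1.0 - Kaa.mean()) + (1.0 - Kbb.mean())      # k(x, x) = 1 for the RBF kernel
    return between / (within + 1e-12)

def pick_gamma(X_a, X_b, grid=tuple(2.0 ** k for k in range(-10, 5))):
    # Return the candidate gamma with the highest separability score.
    return max(grid, key=lambda g: separability(X_a, X_b, g))

Under this sketch, evaluating each candidate costs only one Gram-matrix computation, which is far cheaper than solving an SVM training problem at every grid point.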
Source
Computer Engineering and Applications (《计算机工程与应用》)
Indexed in CSCD and the Peking University Core Journals list (北大核心)
2010, No. 15, pp. 165-168 (4 pages)
Keywords
Support Vector Machine (SVM)
kernel parameter selection
feature space
Separability Measure (SM)