Abstract
Support vector regression machines (SVRMs), grounded in statistical learning theory (SLT), map training samples into a high-dimensional space. However, the running speed and accuracy of the standard SVRM are sometimes unsatisfactory. For the linearly inseparable case, two squared slack terms are added to the objective function of the SVRM, which removes two constraint conditions. Each slack term is assigned its own weighting factor, so the weights can be adjusted to suit practical needs. The resulting algorithm, called the weighted support vector regression machine (WSVRM), is applied to function approximation. Experimental results show that the proposed WSVRM has good function-estimation and data-forecasting capability.
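The core idea described in the abstract, replacing the inequality-constrained slack variables of standard SVR with squared slack penalties in the objective, closely resembles least-squares SVM regression, where the dual reduces to a single linear system. The following is a minimal sketch of that reduction, not the paper's exact formulation: the RBF kernel width `sigma`, the two hypothetical weighting factors `c1` and `c2` (which, with squared slacks, merge into one effective ridge term), and the toy data are all assumptions.

```python
import numpy as np

def rbf_kernel(X, Z, sigma=0.5):
    # Gaussian (RBF) kernel matrix between two sets of 1-D points
    d2 = (X[:, None] - Z[None, :]) ** 2
    return np.exp(-d2 / (2.0 * sigma ** 2))

def fit_ls_svr(x, y, c1=10.0, c2=10.0, sigma=0.5):
    """LS-SVM-style regression with squared slack penalties.

    c1 and c2 are hypothetical weights on the two squared slack
    terms; because the slacks enter the objective quadratically,
    they combine into a single effective regularizer 1/(c1 + c2)
    added to the diagonal of the kernel matrix.
    """
    n = len(x)
    K = rbf_kernel(x, x, sigma)
    # KKT linear system of the dual: unknowns are bias b and alphas
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / (c1 + c2)
    rhs = np.concatenate(([0.0], y))
    sol = np.linalg.solve(A, rhs)
    b, alpha = sol[0], sol[1:]
    # Predictor: f(x) = sum_i alpha_i k(x, x_i) + b
    return lambda xq: rbf_kernel(xq, x, sigma) @ alpha + b

# Toy function-approximation example: fit y = sin(x)
x = np.linspace(0.0, 2.0 * np.pi, 30)
y = np.sin(x)
predict = fit_ls_svr(x, y)
print(np.max(np.abs(predict(x) - y)))  # small training residual
```

Solving one dense linear system instead of a quadratic program with inequality constraints is what makes this family of formulations faster than standard SVR, at the cost of losing sparsity in the support vectors.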
Source
Journal of Northeastern University (Natural Science) (《东北大学学报(自然科学版)》)
Indexed in: EI, CAS, CSCD, Peking University Core Journals
2011, No. 12, pp. 1684-1687 (4 pages)
Funding
Supported by the National Natural Science Foundation of China (60843007, 61050006)
Keywords
statistical learning theory
support vector regression machine
kernel function
weighting factor
function approximation