Abstract
A direct support vector machine for regression (DSVMR) is obtained by adding the square of the hyperplane bias term to the objective function of the least squares support vector machine for regression (LSSVMR). This strengthens the convexity of the problem to be solved: compared with LSSVMR, the solution requires only the inversion of a single symmetric positive definite matrix similar to the kernel matrix. The Cholesky decomposition and the SMW (Sherman-Morrison-Woodbury) inversion formula are then employed, which greatly reduces the computational cost and accelerates learning, while the approximation ability remains nearly identical to that of LSSVMR. Numerical experiments show that DSVMR is feasible and possesses the above advantages.
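The solution scheme the abstract describes can be sketched as follows. Under the standard DSVMR formulation (minimize ½‖w‖² + ½b² + (C/2)Σeᵢ² subject to yᵢ = w·φ(xᵢ) + b + eᵢ), the dual reduces to the linear system (K + I/C + ee^T)α = y with b = e^T α, where K is the kernel matrix and e the all-ones vector. The sketch below, with illustrative function names, factors the SPD part K + I/C by Cholesky and applies the SMW identity to the rank-one term ee^T; it is a minimal illustration, not the authors' code:

```python
import numpy as np

def rbf_kernel(X, Z, gamma=1.0):
    # Gaussian (RBF) kernel matrix between row-sample sets X and Z
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def dsvmr_fit(X, y, C=10.0, gamma=1.0):
    """Solve (K + I/C + e e^T) alpha = y via Cholesky + SMW.

    A = K + I/C is symmetric positive definite; the rank-one term
    e e^T (from the squared bias in the objective) is handled by the
    Sherman-Morrison-Woodbury identity, avoiding a second factorization.
    """
    n = X.shape[0]
    K = rbf_kernel(X, X, gamma)
    A = K + np.eye(n) / C               # SPD part of the system matrix
    L = np.linalg.cholesky(A)           # A = L L^T

    def solve_A(rhs):
        # two triangular solves instead of forming A^{-1} explicitly
        return np.linalg.solve(L.T, np.linalg.solve(L, rhs))

    e = np.ones(n)
    Ainv_y = solve_A(y)
    Ainv_e = solve_A(e)
    # SMW: (A + e e^T)^{-1} y = A^{-1}y - A^{-1}e (e^T A^{-1}y) / (1 + e^T A^{-1}e)
    alpha = Ainv_y - Ainv_e * (e @ Ainv_y) / (1.0 + e @ Ainv_e)
    b = e @ alpha                       # bias recovered from alpha
    return alpha, b

def dsvmr_predict(X_train, alpha, b, X_new, gamma=1.0):
    return rbf_kernel(X_new, X_train, gamma) @ alpha + b
```

Since the Cholesky factor of K + I/C costs O(n³/3) and the SMW correction only O(n²), the rank-one bias term adds essentially no overhead, which is the computational advantage over solving LSSVMR's larger indefinite KKT system.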
Source
《系统工程与电子技术》 (Systems Engineering and Electronics)
Indexed in: EI, CSCD, Peking University Core Journals (北大核心)
2009, No. 1, pp. 178-181 (4 pages)
Funding
Supported by the National Natural Science Foundation of China (60574075, 60705004)
Keywords
support vector machine
direct support vector machine
regression
positive definite matrix