Abstract
Support vector machines can be posed as quadratic programming problems in several different ways. For linear problems, starting from one such formulation and applying the Lagrangian dual technique, the high-dimensional quadratic program in feature space is transformed into a low-dimensional, unconstrained, differentiable convex dual program in input space: the resulting problem minimizes a differentiable convex piecewise quadratic function whose dimension is that of the input space rather than the feature space. Exploiting the piecewise quadratic structure of the objective and combining it with a fast exact one-dimensional search, a Conjugate Gradient Support Vector Machine (CGSVM) is proposed to solve this unconstrained problem quickly. By factorizing the kernel matrix with the Cholesky or incomplete Cholesky factorization, nonlinear classification with kernel functions can also be handled by CGSVM with only a small increase in algorithmic complexity. On an ordinary PC, CGSVM can rapidly solve linear training problems with millions of points and fairly large nonlinear training problems (three thousand points or more). Extensive numerical experiments and complexity analysis show that the proposed algorithm is efficient compared with similar algorithms such as ASVM and LSVM.
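The abstract does not spell out the reformulation, but a formulation consistent with its description (and with the ASVM/LSVM family it is compared against) is the following unconstrained, piecewise quadratic problem in the input-space variables; the exact variant used in the paper may differ:

$$
\min_{(w,\gamma)\in\mathbb{R}^{n+1}} \; f(w,\gamma) \;=\; \frac{1}{2}\bigl(\|w\|^{2}+\gamma^{2}\bigr) \;+\; \frac{\nu}{2}\,\bigl\|\bigl(e - D(Aw - e\gamma)\bigr)_{+}\bigr\|^{2},
$$

where $A\in\mathbb{R}^{m\times n}$ stacks the training points, $D=\mathrm{diag}(y)$ holds the $\pm 1$ labels, $e$ is the all-ones vector, $(\cdot)_{+}=\max(\cdot,0)$ acts componentwise, and $\nu>0$ is the penalty parameter. Such an $f$ is convex, differentiable, and piecewise quadratic, so its restriction to any search ray is a one-dimensional piecewise quadratic whose exact minimizer can be found in finitely many steps; this is the structural property a fast exact line search can exploit.

A minimal sketch of a conjugate gradient solver with an exact line search for an objective of this form is given below. It is an illustration under the assumptions above, not the paper's implementation; the names cgsvm_fit and exact_step are hypothetical.

```python
import numpy as np

def cgsvm_fit(X, y, nu=1.0, max_iter=200, tol=1e-6):
    """Nonlinear conjugate gradient (Polak-Ribiere+) with an exact line search
    for f(z) = 0.5*||z||^2 + (nu/2)*||(1 - y*(H @ z))_+||^2,
    where H = [X, -1] appends a bias column and z = (w, gamma).
    Illustrative sketch only; details are assumptions, not the paper's code."""
    m, n = X.shape
    H = np.hstack([X, -np.ones((m, 1))])
    z = np.zeros(n + 1)

    def grad(z):
        r = np.maximum(1.0 - y * (H @ z), 0.0)      # active margin residuals
        return z - nu * (H.T @ (y * r))

    def exact_step(z, d):
        # Along the ray z + t*d the derivative phi'(t) is piecewise linear and
        # nondecreasing; locate the interval containing its root via the
        # breakpoints where a residual changes sign, then solve linearly.
        r0 = 1.0 - y * (H @ z)
        s = -y * (H @ d)
        zd, dd = z @ d, d @ d
        dphi = lambda t: zd + t * dd + nu * np.dot(np.maximum(r0 + t * s, 0.0), s)
        bps = np.sort(-r0[s != 0.0] / s[s != 0.0])
        bps = bps[bps > 0.0]
        lo, hi = 0.0, None
        for b in bps:                               # naive scan; O(m) breakpoints
            if dphi(b) >= 0.0:
                hi = b
                break
            lo = b
        t_probe = lo + 1.0 if hi is None else 0.5 * (lo + hi)
        act = (r0 + t_probe * s) > 0.0              # active set fixed on this interval
        t = -(zd + nu * np.dot(r0[act], s[act])) / (dd + nu * np.dot(s[act], s[act]))
        return max(t, 0.0)

    g = grad(z)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        z = z + exact_step(z, d) * d
        g_new = grad(z)
        beta = max(0.0, g_new @ (g_new - g) / (g @ g))   # PR+ restart
        d = -g_new + beta * d
        g = g_new
    return z[:-1], z[-1]                                  # (w, gamma)

# Toy usage on synthetic data with labels in {-1, +1}.
rng = np.random.default_rng(0)
X = rng.standard_normal((1000, 10))
y = np.where(X[:, 0] + 0.2 * rng.standard_normal(1000) > 0, 1.0, -1.0)
w, gamma = cgsvm_fit(X, y, nu=10.0)
print("training accuracy:", np.mean(np.sign(X @ w - gamma) == y))
```

For the nonlinear case, the abstract reduces the kernel problem to the same machinery by factorizing the kernel matrix. A standard pivoted incomplete Cholesky sketch producing K ≈ G Gᵀ is shown below (the paper's exact construction may differ); the rows of G can then be passed to the linear solver in place of X.

```python
def pivoted_incomplete_cholesky(X, kernel, max_rank, tol=1e-8):
    """Greedy pivoted incomplete Cholesky of K[i, j] = kernel(X[i], X[j]),
    returning G with K ~= G @ G.T without forming K in full.
    Rows of G then act as explicit features for the linear solver above."""
    m = X.shape[0]
    G = np.zeros((m, max_rank))
    diag = np.array([kernel(x, x) for x in X])      # residual diagonal of K
    for j in range(max_rank):
        i = int(np.argmax(diag))
        if diag[i] <= tol:                          # remaining error negligible
            return G[:, :j]
        col = np.array([kernel(x, X[i]) for x in X])            # column K[:, i]
        G[:, j] = (col - G[:, :j] @ G[i, :j]) / np.sqrt(diag[i])
        diag = np.maximum(diag - G[:, j] ** 2, 0.0)
    return G

# Example: an RBF kernel, then reuse of the linear sketch on the factor.
# rbf = lambda a, b: np.exp(-np.sum((a - b) ** 2))
# G = pivoted_incomplete_cholesky(X, rbf, max_rank=100)
# w, gamma = cgsvm_fit(G, y, nu=10.0)
```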
Source
Pattern Recognition and Artificial Intelligence (《模式识别与人工智能》), 2006, No. 2, pp. 129-136 (8 pages). Indexed in EI and CSCD; a Peking University core journal.
Funding
"十五"国家部委科技(电子)预研资助项目(No.413160501)